Feb 17 17:47:26 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 17:47:26 crc restorecon[4713]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:26 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:47:27 crc restorecon[4713]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 
crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 
17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc 
restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:27 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 
crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc 
restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:47:28 crc restorecon[4713]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc 
restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:47:28 crc restorecon[4713]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 17:47:28 crc kubenswrapper[4762]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 17:47:28 crc kubenswrapper[4762]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 17:47:28 crc kubenswrapper[4762]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 17:47:28 crc kubenswrapper[4762]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 17:47:28 crc kubenswrapper[4762]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 17:47:28 crc kubenswrapper[4762]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.803882 4762 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810464 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810495 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810507 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810519 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810528 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810537 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810546 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810554 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810563 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810571 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810579 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810587 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810595 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810602 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810610 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810618 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810657 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810668 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810677 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810687 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810695 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810703 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810718 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810727 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810735 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810743 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810751 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810759 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810767 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810775 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810783 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:47:28 crc 
kubenswrapper[4762]: W0217 17:47:28.810791 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810799 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810807 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810814 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810822 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810832 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810839 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810847 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810855 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810862 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810872 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810880 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810888 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810897 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810904 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810913 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810923 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810934 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810945 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810954 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810962 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810970 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810978 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810986 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.810993 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811001 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811008 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811016 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811024 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811032 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811039 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811047 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811055 
4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811062 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811072 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811080 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811089 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811096 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811104 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.811111 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811248 4762 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811264 4762 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811279 4762 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811290 4762 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811302 4762 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811311 4762 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811323 4762 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811345 4762 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811367 4762 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811386 4762 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811399 4762 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811412 4762 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811423 4762 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811434 4762 flags.go:64] FLAG: --cgroup-root="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811446 4762 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811457 4762 flags.go:64] FLAG: --client-ca-file="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811468 4762 flags.go:64] FLAG: --cloud-config="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811477 4762 flags.go:64] FLAG: --cloud-provider="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811488 4762 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811515 4762 flags.go:64] FLAG: --cluster-domain="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811531 4762 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811544 4762 flags.go:64] FLAG: --config-dir="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811555 4762 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811568 4762 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 
17:47:28.811583 4762 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811658 4762 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811669 4762 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811681 4762 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811691 4762 flags.go:64] FLAG: --contention-profiling="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811701 4762 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811710 4762 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811720 4762 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811729 4762 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811755 4762 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811765 4762 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811774 4762 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811783 4762 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811792 4762 flags.go:64] FLAG: --enable-server="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811801 4762 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811815 4762 flags.go:64] FLAG: --event-burst="100" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811826 4762 flags.go:64] FLAG: --event-qps="50" Feb 17 17:47:28 
crc kubenswrapper[4762]: I0217 17:47:28.811835 4762 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811844 4762 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811854 4762 flags.go:64] FLAG: --eviction-hard="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811865 4762 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811874 4762 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811885 4762 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811894 4762 flags.go:64] FLAG: --eviction-soft="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811903 4762 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811912 4762 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811921 4762 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811930 4762 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811939 4762 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811948 4762 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811957 4762 flags.go:64] FLAG: --feature-gates="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811967 4762 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811977 4762 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811986 4762 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.811995 4762 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812004 4762 flags.go:64] FLAG: --healthz-port="10248" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812015 4762 flags.go:64] FLAG: --help="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812024 4762 flags.go:64] FLAG: --hostname-override="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812033 4762 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812042 4762 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812051 4762 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812059 4762 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812068 4762 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812077 4762 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812086 4762 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812095 4762 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812104 4762 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812113 4762 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812123 4762 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812132 4762 flags.go:64] FLAG: --kube-reserved="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812141 4762 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812150 4762 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812159 4762 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812168 4762 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812177 4762 flags.go:64] FLAG: --lock-file="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812187 4762 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812196 4762 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812205 4762 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812219 4762 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812228 4762 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812238 4762 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812247 4762 flags.go:64] FLAG: --logging-format="text" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812256 4762 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812266 4762 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812275 4762 flags.go:64] FLAG: --manifest-url="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812284 4762 flags.go:64] FLAG: --manifest-url-header="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812295 4762 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812304 4762 
flags.go:64] FLAG: --max-open-files="1000000" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812315 4762 flags.go:64] FLAG: --max-pods="110" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812324 4762 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812334 4762 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812343 4762 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812353 4762 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812362 4762 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812371 4762 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812381 4762 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812400 4762 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812410 4762 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812420 4762 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812430 4762 flags.go:64] FLAG: --pod-cidr="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812438 4762 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812452 4762 flags.go:64] FLAG: --pod-manifest-path="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812461 4762 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 
17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812470 4762 flags.go:64] FLAG: --pods-per-core="0" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812479 4762 flags.go:64] FLAG: --port="10250" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812489 4762 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812498 4762 flags.go:64] FLAG: --provider-id="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812506 4762 flags.go:64] FLAG: --qos-reserved="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812515 4762 flags.go:64] FLAG: --read-only-port="10255" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812524 4762 flags.go:64] FLAG: --register-node="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812533 4762 flags.go:64] FLAG: --register-schedulable="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812543 4762 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812557 4762 flags.go:64] FLAG: --registry-burst="10" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812566 4762 flags.go:64] FLAG: --registry-qps="5" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812576 4762 flags.go:64] FLAG: --reserved-cpus="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812584 4762 flags.go:64] FLAG: --reserved-memory="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812595 4762 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812604 4762 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812613 4762 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812645 4762 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812654 
4762 flags.go:64] FLAG: --runonce="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812663 4762 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812673 4762 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812682 4762 flags.go:64] FLAG: --seccomp-default="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812691 4762 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812700 4762 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812709 4762 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812719 4762 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812731 4762 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812745 4762 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812770 4762 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812782 4762 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812794 4762 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812807 4762 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812819 4762 flags.go:64] FLAG: --system-cgroups="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812830 4762 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812848 4762 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 
17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812860 4762 flags.go:64] FLAG: --tls-cert-file="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812872 4762 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812886 4762 flags.go:64] FLAG: --tls-min-version="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812898 4762 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812909 4762 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812921 4762 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812932 4762 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812942 4762 flags.go:64] FLAG: --v="2" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812956 4762 flags.go:64] FLAG: --version="false" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812969 4762 flags.go:64] FLAG: --vmodule="" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812985 4762 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.812997 4762 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813240 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813257 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813292 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813304 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813315 4762 
feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813326 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813337 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813347 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813357 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813367 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813377 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813387 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813397 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813406 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813416 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813426 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813436 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813446 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813456 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 
17:47:28.813466 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813478 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813488 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813499 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813513 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813526 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813540 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813553 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813565 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813576 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813586 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813596 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813605 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813615 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813660 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813673 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813683 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813693 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813703 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813713 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813726 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813738 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813748 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813758 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813768 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813778 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813788 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813798 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813807 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813817 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813828 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813838 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813848 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813864 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813874 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 
17:47:28.813884 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813894 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813903 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813911 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813920 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813929 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813938 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813947 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813956 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813965 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813975 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813984 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.813993 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.814005 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.814016 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.814027 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.814038 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.814068 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.821843 4762 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.821883 4762 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821942 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821950 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821955 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821960 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821966 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821971 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821975 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821979 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821983 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821988 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821992 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.821997 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822000 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822004 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822007 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822011 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822015 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822018 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822022 4762 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822025 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822029 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822033 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822037 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822040 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822043 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822047 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822051 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822054 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822057 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822061 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822066 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822070 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822074 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822077 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822082 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822085 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822088 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822093 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822098 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822101 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822105 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822108 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822112 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822116 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822120 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822123 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822126 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822130 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822134 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822137 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822140 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822144 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822147 4762 feature_gate.go:330] 
unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822151 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822155 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822158 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822162 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822165 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822169 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822172 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822176 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822179 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822183 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822188 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822192 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822197 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822201 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:47:28 crc 
kubenswrapper[4762]: W0217 17:47:28.822205 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822210 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822214 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822219 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.822225 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822331 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822339 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822345 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822349 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822353 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822357 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822361 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822364 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822368 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822372 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822376 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822380 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822384 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822388 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822391 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822394 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:47:28 crc 
kubenswrapper[4762]: W0217 17:47:28.822398 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822402 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822405 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822409 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822412 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822416 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822421 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822425 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822429 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822433 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822436 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822440 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822444 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822448 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822452 
4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822455 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822459 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822463 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822467 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822471 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822475 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822479 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822482 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822486 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822489 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822493 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822496 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822502 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822505 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:47:28 crc 
kubenswrapper[4762]: W0217 17:47:28.822508 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822512 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822516 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822519 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822523 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822526 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822529 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822533 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822537 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822541 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822545 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822548 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822552 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822555 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822559 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822562 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822566 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822570 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822575 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822579 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822582 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822585 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822589 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822594 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822597 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.822602 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.822608 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.822797 4762 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.827359 4762 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.827448 4762 
certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.829371 4762 server.go:997] "Starting client certificate rotation" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.829410 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.829700 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-30 15:41:09.103689074 +0000 UTC Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.829904 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.854097 4762 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.856552 4762 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.858382 4762 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.875330 4762 log.go:25] "Validated CRI v1 runtime API" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.911394 4762 log.go:25] "Validated CRI v1 image API" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.913170 4762 server.go:1437] "Using cgroup driver setting received from the CRI 
runtime" cgroupDriver="systemd" Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.920154 4762 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-17-43-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.920204 4762 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.942206 4762 manager.go:217] Machine: {Timestamp:2026-02-17 17:47:28.939254136 +0000 UTC m=+0.584172196 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1dc8183f-0bbf-41f8-ae92-b64e8a8697b3 BootID:60e49c4c-5e4b-4bf6-9895-1e12c94f3d77 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 
Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:76:37:6c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:76:37:6c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:38:9b:2c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:32:7a:1f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ab:1e:58 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:4e:80:4d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:62:5c:7a:b9:a9:54 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:73:71:0a:7d:24 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 
BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 17:47:28 crc 
kubenswrapper[4762]: I0217 17:47:28.942478 4762 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.942660 4762 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.943001 4762 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.943241 4762 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.943298 4762 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.943603 4762 topology_manager.go:138] "Creating topology manager with none policy"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.943615 4762 container_manager_linux.go:303] "Creating device plugin manager"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.944233 4762 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.944277 4762 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.945015 4762 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.945138 4762 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.953416 4762 kubelet.go:418] "Attempting to sync node with API server"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.953447 4762 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.953479 4762 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.953497 4762 kubelet.go:324] "Adding apiserver pod source"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.953593 4762 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.960838 4762 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.961700 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.961776 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.961897 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.962003 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.962023 4762 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.964253 4762 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966549 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966590 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966610 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966653 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966675 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966688 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966700 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966720 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966733 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966748 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966779 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.966792 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.969105 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.969830 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.969925 4762 server.go:1280] "Started kubelet"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.977958 4762 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.978017 4762 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.979195 4762 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 17:47:28 crc systemd[1]: Started Kubernetes Kubelet.
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.979815 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.979861 4762 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.980157 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 12:19:51.023058765 +0000 UTC
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.980289 4762 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.980305 4762 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.980299 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.980350 4762 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.980729 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms"
Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.980071 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189519d92b910351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:47:28.969679697 +0000 UTC m=+0.614597717,LastTimestamp:2026-02-17 17:47:28.969679697 +0000 UTC m=+0.614597717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 17:47:28 crc kubenswrapper[4762]: W0217 17:47:28.981728 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 17 17:47:28 crc kubenswrapper[4762]: E0217 17:47:28.981854 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.981905 4762 factory.go:55] Registering systemd factory
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.981955 4762 factory.go:221] Registration of the systemd container factory successfully
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.983225 4762 factory.go:153] Registering CRI-O factory
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.983254 4762 factory.go:221] Registration of the crio container factory successfully
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.983328 4762 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.983362 4762 factory.go:103] Registering Raw factory
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.983382 4762 manager.go:1196] Started watching for new ooms in manager
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.984206 4762 manager.go:319] Starting recovery of all containers
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.984466 4762 server.go:460] "Adding debug handlers to kubelet server"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991385 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991535 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991597 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991684 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991752 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991819 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991875 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991932 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.991992 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.992055 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.992114 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.992172 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.992232 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.992296 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.993336 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.993374 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.993422 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.993454 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.993493 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.993517 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.996920 4762 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997021 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997050 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997070 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997097 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997117 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997145 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997173 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997205 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997233 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997256 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997276 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997299 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997322 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997341 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997394 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997414 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997441 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997463 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997484 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997507 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997528 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997554 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997575 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997597 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997662 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997798 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997903 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997919 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997939 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997955 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997969 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.997986 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998009 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998026 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998045 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998066 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998081 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998098 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998111 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998122 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998138 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998151 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998168 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998181 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998198 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998214 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998227 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998244 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998258 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998269 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 17 17:47:28 crc kubenswrapper[4762]: I0217 17:47:28.998286 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998299 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998316 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998328 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998343 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998358 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998372 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998389 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998404 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998415 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998434 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the
actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998445 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998460 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998474 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998488 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998502 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998517 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998532 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998544 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998558 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998574 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998590 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998605 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998616 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998646 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998661 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998675 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998689 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998704 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998722 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998743 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998757 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998775 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998790 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998820 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 17:47:29 crc 
kubenswrapper[4762]: I0217 17:47:28.998836 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998857 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998878 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998894 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998913 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998929 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998945 4762 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998963 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998976 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.998992 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999005 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999021 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999032 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999046 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999058 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999070 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999084 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999097 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999111 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999125 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999139 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999154 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999166 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999183 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999200 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: 
I0217 17:47:28.999217 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999242 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999259 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999279 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999293 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999307 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999322 4762 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999336 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999353 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999366 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999378 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999394 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999415 4762 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999428 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999448 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999463 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999477 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999489 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999501 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999515 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999529 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999544 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999555 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999569 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999583 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999595 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999708 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999721 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999732 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999746 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999759 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999772 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999784 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999799 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999814 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:28.999826 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000146 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000224 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000244 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000271 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000292 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000317 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000334 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000352 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000377 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000392 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000413 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000431 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000446 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000464 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000479 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000500 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000515 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000531 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000551 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000569 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000584 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000608 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000640 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000662 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000692 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000708 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000729 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000746 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000767 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000783 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000799 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000819 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000834 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000854 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000869 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000883 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000916 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000933 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000956 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000972 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.000986 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.001004 4762 reconstruct.go:97] "Volume reconstruction finished"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.001015 4762 reconciler.go:26] "Reconciler: start to sync state"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.006225 4762 manager.go:324] Recovery completed
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.019410 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.021147 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.021190 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.021200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.022112 4762 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.022136 4762 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.022158 4762 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.031799 4762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.034473 4762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.034514 4762 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.034563 4762 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.034614 4762 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.038021 4762 policy_none.go:49] "None policy: Start"
Feb 17 17:47:29 crc kubenswrapper[4762]: W0217 17:47:29.038011 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.038102 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.038885 4762 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.038914 4762 state_mem.go:35] "Initializing new in-memory state store"
Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.081429 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.108802 4762 manager.go:334] "Starting Device Plugin manager"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.108858 4762 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.108876 4762 server.go:79] "Starting device plugin registration server"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.109452 4762 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.109479 4762 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.109810 4762 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.109995 4762 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.110005 4762 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.119851 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.135442 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"]
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.135556 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.136890 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.136925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.136937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.137107 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.137450 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.137512 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138328 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138416 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138441 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138475 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.138503 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.139843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.139866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.139874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.139923 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.139951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.139966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.140094 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.140211 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.140241 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.140935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.140954 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.140964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141075 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141241 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141277 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141900 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.141911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.142141 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.142172 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.142333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.142362 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.142370 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.143321 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.143343 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.143360 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.181332 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204076 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204112 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204136 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204152 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204169 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204187 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204223 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204254 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204295 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204322 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204344 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204365 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204399 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.204670 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.210256 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.211175 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.211202 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.211215 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.211246 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.211575 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.305994 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306071 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306129 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306156 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306172 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306196 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306234 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306284 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306283 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306329 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306231 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306300 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306296 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306308 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306382 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306389 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306429 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306464 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306497 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306514 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306552 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306576 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306599 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306606 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306649 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306662 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306688 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306733 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.306804 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.411970 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.413290 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.413341 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.413353 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.413386 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.413935 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.461605 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.470847 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.475729 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: W0217 17:47:29.505472 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-78611d72c712b7b06acd11b876107c7f6ee5c45f01eee1d47250a735396816b2 WatchSource:0}: Error finding container 78611d72c712b7b06acd11b876107c7f6ee5c45f01eee1d47250a735396816b2: Status 404 returned error can't find the container with id 78611d72c712b7b06acd11b876107c7f6ee5c45f01eee1d47250a735396816b2 Feb 17 17:47:29 crc kubenswrapper[4762]: W0217 17:47:29.508164 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1c45ee0544c762b69006b6b811a0a27d1ca0bf20a80399705be6ae3172976ab4 WatchSource:0}: Error finding container 1c45ee0544c762b69006b6b811a0a27d1ca0bf20a80399705be6ae3172976ab4: Status 404 returned error can't find the container with id 1c45ee0544c762b69006b6b811a0a27d1ca0bf20a80399705be6ae3172976ab4 Feb 17 17:47:29 crc kubenswrapper[4762]: W0217 17:47:29.515416 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b38a61b9a4ef4a1bcfd187e970187471f0123e829bc33bd795fe38eb52e12122 WatchSource:0}: Error finding container b38a61b9a4ef4a1bcfd187e970187471f0123e829bc33bd795fe38eb52e12122: Status 404 returned error can't find the container with id b38a61b9a4ef4a1bcfd187e970187471f0123e829bc33bd795fe38eb52e12122 Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.515571 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.522150 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 17:47:29 crc kubenswrapper[4762]: W0217 17:47:29.533598 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-df459b5b8dd147dc73c940bc7b11f91c3254c666e64657638b9ef503a8b47184 WatchSource:0}: Error finding container df459b5b8dd147dc73c940bc7b11f91c3254c666e64657638b9ef503a8b47184: Status 404 returned error can't find the container with id df459b5b8dd147dc73c940bc7b11f91c3254c666e64657638b9ef503a8b47184 Feb 17 17:47:29 crc kubenswrapper[4762]: W0217 17:47:29.545530 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c662004ab5a424030dd1c3abb115cd2b0d296c5e210aa1b965ccb9f6a763c4ff WatchSource:0}: Error finding container c662004ab5a424030dd1c3abb115cd2b0d296c5e210aa1b965ccb9f6a763c4ff: Status 404 returned error can't find the container with id c662004ab5a424030dd1c3abb115cd2b0d296c5e210aa1b965ccb9f6a763c4ff Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.582135 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.814835 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.816266 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.816322 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.816334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.816365 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:47:29 crc kubenswrapper[4762]: E0217 17:47:29.816886 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.971246 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:29 crc kubenswrapper[4762]: I0217 17:47:29.980410 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:43:16.833660389 +0000 UTC Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.039126 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c662004ab5a424030dd1c3abb115cd2b0d296c5e210aa1b965ccb9f6a763c4ff"} Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.040466 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"df459b5b8dd147dc73c940bc7b11f91c3254c666e64657638b9ef503a8b47184"} Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.041575 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b38a61b9a4ef4a1bcfd187e970187471f0123e829bc33bd795fe38eb52e12122"} Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.042398 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c45ee0544c762b69006b6b811a0a27d1ca0bf20a80399705be6ae3172976ab4"} Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.043203 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"78611d72c712b7b06acd11b876107c7f6ee5c45f01eee1d47250a735396816b2"} Feb 17 17:47:30 crc kubenswrapper[4762]: W0217 17:47:30.187553 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.187671 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:30 crc kubenswrapper[4762]: W0217 17:47:30.192318 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 
17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.192473 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:30 crc kubenswrapper[4762]: W0217 17:47:30.269687 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.269763 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.383800 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Feb 17 17:47:30 crc kubenswrapper[4762]: W0217 17:47:30.403333 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.403467 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.617650 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.619081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.619113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.619122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.619145 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.619523 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.953098 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 17:47:30 crc kubenswrapper[4762]: E0217 17:47:30.954223 4762 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.970965 4762 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:30 crc kubenswrapper[4762]: I0217 17:47:30.981110 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:38:27.003950037 +0000 UTC Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.049862 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.049919 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.049936 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.050190 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.050214 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 
17:47:31.051853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.051902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.051918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.053078 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163" exitCode=0 Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.053386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.053583 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.055855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.055917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.055942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.056505 4762 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7e6ba3478adf50ab6a90df5b98191da3d993269d68c497b267268d324df81e04" exitCode=0 Feb 17 17:47:31 crc 
kubenswrapper[4762]: I0217 17:47:31.056656 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7e6ba3478adf50ab6a90df5b98191da3d993269d68c497b267268d324df81e04"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.056795 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.057790 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.057936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.057991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.058016 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.058596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.058654 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.058681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.058849 4762 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5" exitCode=0 Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.058956 4762 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.059076 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.060249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.060446 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.060585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.061064 4762 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64" exitCode=0 Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.061114 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64"} Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.061153 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.062583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.062667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:31 
crc kubenswrapper[4762]: I0217 17:47:31.062692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.479208 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:31 crc kubenswrapper[4762]: W0217 17:47:31.817229 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:31 crc kubenswrapper[4762]: E0217 17:47:31.817477 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.970892 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:31 crc kubenswrapper[4762]: I0217 17:47:31.981470 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:09:01.275294124 +0000 UTC Feb 17 17:47:31 crc kubenswrapper[4762]: E0217 17:47:31.985250 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s" Feb 17 
17:47:32 crc kubenswrapper[4762]: W0217 17:47:32.021342 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:32 crc kubenswrapper[4762]: E0217 17:47:32.021412 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.065795 4762 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a005f5eed40fd8e785788090bdd7279be3f25b9591d2263bc37ece04cb06b681" exitCode=0 Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.065859 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a005f5eed40fd8e785788090bdd7279be3f25b9591d2263bc37ece04cb06b681"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.065951 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.066634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.066660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.066669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 
17:47:32.068144 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.068140 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.069418 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.069439 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.069448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.071106 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.071136 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.071146 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 
17:47:32.071166 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.071816 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.071855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.071865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073907 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073942 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073952 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073960 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073970 4762 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498"} Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073943 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.073995 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.074778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.074795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.074803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.074812 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.074814 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.074987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:32 crc kubenswrapper[4762]: W0217 17:47:32.135018 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:32 crc kubenswrapper[4762]: E0217 17:47:32.135132 4762 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:32 crc kubenswrapper[4762]: W0217 17:47:32.188470 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 17 17:47:32 crc kubenswrapper[4762]: E0217 17:47:32.188570 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.219650 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.221042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.221079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.221092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.221119 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:47:32 crc kubenswrapper[4762]: E0217 17:47:32.221802 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.736731 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:32 crc kubenswrapper[4762]: I0217 17:47:32.982283 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 02:26:50.201350126 +0000 UTC Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082107 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d78ede3f2035a71208f0d524d5e5eb41d0d740d189f893c233485978c022e97e"} Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082163 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082665 4762 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d78ede3f2035a71208f0d524d5e5eb41d0d740d189f893c233485978c022e97e" exitCode=0 Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082771 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082815 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082853 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082883 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082817 4762 prober_manager.go:312] "Failed to trigger 
a manual run" probe="Readiness" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.082987 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.083412 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.083448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.083462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084951 4762 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084953 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084995 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.084965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.085086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.237326 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:33 crc kubenswrapper[4762]: I0217 17:47:33.983341 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 04:12:51.292753057 +0000 UTC Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.090650 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a67898566d7f13f83ccadad36a5f031fb4b9adb80d5ad9ef56867c9e49184cdc"} Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.090708 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ca0e2fc0fc5a047f7151c817fd77e0cfc66094be521d3c03b6f0298c9824f749"} Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.090722 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c6e88f3e3d2989bee06b0b293548720bf361ca5a0d8e305d39fb6a2cca89b09"} Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.090722 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.090773 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.091859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.091893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.091906 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.966014 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.978909 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.979219 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.980314 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.980348 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.980359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.983545 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 15:45:25.712977218 +0000 UTC Feb 17 17:47:34 crc kubenswrapper[4762]: I0217 17:47:34.985591 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.099693 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b93cc120f0b75ba6ff60977305a44f6a0177cb9d1ca8ece0f183da54def27a2"} Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.099756 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06864bdb2d82f090f760ccd3f94a7f2bb84c653d1df2a6a90dabbf75a9cefca6"} Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.099796 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.099811 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.099916 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.099979 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101116 4762 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101287 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101384 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.101422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.421904 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.423808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.423848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.423866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.423894 
4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:47:35 crc kubenswrapper[4762]: I0217 17:47:35.984299 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 15:09:03.882172699 +0000 UTC Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.025423 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.102527 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.103298 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.103327 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.103339 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.849062 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.849265 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.850686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.850735 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.850749 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 17:47:36 crc kubenswrapper[4762]: I0217 17:47:36.984850 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 07:58:03.690915655 +0000 UTC Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.014204 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.014403 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.015772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.015793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.015801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.104849 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.105914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.105962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.105974 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.230587 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.230900 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.232232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.232299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.232318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.985860 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 06:34:36.255569913 +0000 UTC Feb 17 17:47:37 crc kubenswrapper[4762]: I0217 17:47:37.987068 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.109726 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.111012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.111050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.111063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.231871 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.232124 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.233919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.234000 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.234013 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:38 crc kubenswrapper[4762]: I0217 17:47:38.986592 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 05:11:42.516951804 +0000 UTC Feb 17 17:47:39 crc kubenswrapper[4762]: E0217 17:47:39.120675 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 17:47:39 crc kubenswrapper[4762]: I0217 17:47:39.987047 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 05:45:48.273284829 +0000 UTC Feb 17 17:47:40 crc kubenswrapper[4762]: I0217 17:47:40.987391 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 14:24:41.954029221 +0000 UTC Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.232506 4762 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.232664 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.483244 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.483385 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.484494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.484541 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.484663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:41 crc kubenswrapper[4762]: I0217 17:47:41.988223 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:10:11.528166294 +0000 UTC Feb 17 17:47:42 crc kubenswrapper[4762]: I0217 17:47:42.971260 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 17:47:42 crc kubenswrapper[4762]: I0217 17:47:42.988498 4762 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 02:17:32.532423503 +0000 UTC Feb 17 17:47:43 crc kubenswrapper[4762]: E0217 17:47:43.002944 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189519d92b910351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:47:28.969679697 +0000 UTC m=+0.614597717,LastTimestamp:2026-02-17 17:47:28.969679697 +0000 UTC m=+0.614597717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.091881 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.091948 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.097459 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe 
status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.097531 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.241679 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]log ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]etcd ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 17 17:47:43 crc kubenswrapper[4762]: 
[+]poststarthook/start-apiextensions-informers ok Feb 17 17:47:43 crc kubenswrapper[4762]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Feb 17 17:47:43 crc kubenswrapper[4762]: [-]poststarthook/crd-informer-synced failed: reason withheld Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 17 17:47:43 crc kubenswrapper[4762]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 17 17:47:43 crc kubenswrapper[4762]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Feb 17 17:47:43 
crc kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]autoregister-completion ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 17 17:47:43 crc kubenswrapper[4762]: livez check failed Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.241739 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:47:43 crc kubenswrapper[4762]: I0217 17:47:43.989487 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:40:09.913959572 +0000 UTC Feb 17 17:47:44 crc kubenswrapper[4762]: I0217 17:47:44.990523 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 06:24:05.748128152 +0000 UTC Feb 17 17:47:45 crc kubenswrapper[4762]: I0217 17:47:45.991660 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 05:05:00.765110004 +0000 UTC Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.058223 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.059263 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.061236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 
17:47:46.061301 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.061326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.073105 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.130402 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.131532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.131585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.131597 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:46 crc kubenswrapper[4762]: I0217 17:47:46.992743 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:32:43.636703592 +0000 UTC Feb 17 17:47:47 crc kubenswrapper[4762]: I0217 17:47:47.993833 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 19:15:25.527062759 +0000 UTC Feb 17 17:47:48 crc kubenswrapper[4762]: E0217 17:47:48.067892 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.071180 
4762 trace.go:236] Trace[383781967]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 17:47:35.903) (total time: 12167ms): Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[383781967]: ---"Objects listed" error: 12167ms (17:47:48.071) Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[383781967]: [12.167256003s] [12.167256003s] END Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.071228 4762 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.071472 4762 trace.go:236] Trace[1686178812]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 17:47:37.474) (total time: 10597ms): Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[1686178812]: ---"Objects listed" error: 10597ms (17:47:48.071) Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[1686178812]: [10.597095932s] [10.597095932s] END Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.071516 4762 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.071828 4762 trace.go:236] Trace[924408487]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 17:47:37.642) (total time: 10429ms): Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[924408487]: ---"Objects listed" error: 10429ms (17:47:48.071) Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[924408487]: [10.429642156s] [10.429642156s] END Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.071861 4762 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.075265 4762 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.076473 4762 trace.go:236] Trace[863131933]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 17:47:35.445) (total time: 12630ms): Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[863131933]: ---"Objects listed" error: 12630ms (17:47:48.076) Feb 17 17:47:48 crc kubenswrapper[4762]: Trace[863131933]: [12.630723157s] [12.630723157s] END Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.076496 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 17:47:48 crc kubenswrapper[4762]: E0217 17:47:48.077569 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.090562 4762 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.109845 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54904->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.109947 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54904->192.168.126.11:17697: read: connection reset by peer" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.137305 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 17:47:48 crc 
kubenswrapper[4762]: I0217 17:47:48.139207 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0" exitCode=255 Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.139249 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0"} Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.236904 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.242534 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.242804 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.247326 4762 scope.go:117] "RemoveContainer" containerID="8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.965704 4762 apiserver.go:52] "Watching apiserver" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.982502 4762 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.982870 4762 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.983272 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.983373 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.983841 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.983892 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.984177 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:48 crc kubenswrapper[4762]: E0217 17:47:48.984256 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:47:48 crc kubenswrapper[4762]: E0217 17:47:48.984426 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.984471 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:48 crc kubenswrapper[4762]: E0217 17:47:48.984514 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.986224 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.986466 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.987073 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.987303 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.987331 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.987374 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.987480 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.987723 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.988314 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 17:47:48 crc kubenswrapper[4762]: I0217 17:47:48.993980 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-12-31 19:52:10.092020117 +0000 UTC Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.013853 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.025843 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information 
is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.042969 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.054303 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.069550 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.081700 4762 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.081942 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082070 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082098 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082119 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082135 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082154 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082173 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082211 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082238 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082263 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082268 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082287 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082313 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082343 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082375 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082406 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082416 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082431 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082493 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082517 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082538 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082555 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082577 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082596 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082612 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082649 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082673 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082698 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082718 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082734 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082753 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082769 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082779 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082786 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082833 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082857 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082885 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082874 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082913 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082945 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082968 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082992 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083028 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083066 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083095 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083140 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083165 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083192 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 
17:47:49.083213 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083236 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083258 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083278 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083298 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083322 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083344 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083371 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083391 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083410 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083432 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:47:49 crc kubenswrapper[4762]: 
I0217 17:47:49.083451 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083492 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083512 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083532 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083558 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083577 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083598 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083618 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083664 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083689 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083713 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083734 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083755 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083777 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083797 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083820 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083845 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083875 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083902 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083967 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: 
I0217 17:47:49.083995 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084026 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084057 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084085 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084113 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084143 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084173 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084201 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084226 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084250 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084272 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084293 
4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084314 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084336 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084357 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084380 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084402 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084429 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084454 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084476 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084499 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084519 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084541 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084564 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084587 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084611 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084653 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084676 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084703 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084724 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084747 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084771 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084794 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084817 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084839 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085081 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085106 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085128 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085151 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085172 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085195 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085221 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085253 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085286 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085321 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085352 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085387 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085420 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085454 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085485 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085510 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085534 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085572 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085596 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085644 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085668 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085782 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085806 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085831 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085853 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085877 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" 
(UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085901 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085929 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085961 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085984 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086011 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086040 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086064 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086088 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086110 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086133 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086158 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086183 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086213 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086238 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086261 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086303 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086327 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086350 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086373 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086405 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086444 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086485 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 17:47:49 crc 
kubenswrapper[4762]: I0217 17:47:49.086530 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086568 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086605 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086659 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086687 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086712 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086734 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086757 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086779 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086802 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086823 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 
17:47:49.086845 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086868 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086892 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086913 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086936 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086959 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086998 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087021 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087046 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087074 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087098 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087123 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087149 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087172 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087194 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087218 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087240 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") 
pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087264 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087287 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087318 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087353 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087389 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087425 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087456 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087481 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087530 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087561 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087613 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:49 crc 
kubenswrapper[4762]: I0217 17:47:49.087909 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087934 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087972 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087995 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088021 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088045 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088168 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088303 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088321 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088337 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088351 4762 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088365 4762 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088379 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089338 4762 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.082878 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083014 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083256 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083294 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.091906 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.092523 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.092710 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.092790 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.092822 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.093708 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.093808 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.093032 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083639 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083573 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083904 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.093774 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084679 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084803 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085761 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085789 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.085888 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086088 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086143 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086169 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.094100 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086241 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086410 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086438 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086480 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086667 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086691 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.086746 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087016 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087009 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087061 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.094206 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087093 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087126 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087283 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087281 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.087884 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088319 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088403 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088445 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088597 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088894 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.088947 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089188 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089290 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089349 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089489 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089677 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089731 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.089991 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090047 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090043 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090153 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090510 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090820 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090850 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090915 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090951 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.090990 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.091013 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.091027 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.091396 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.091818 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.093044 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.084070 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095087 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095248 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095752 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095761 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095903 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095929 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.083306 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.095998 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096282 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096653 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096674 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096614 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096732 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096797 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.096990 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.097216 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.097430 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.097535 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.097725 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.097970 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.098723 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.098771 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.098801 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.098814 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.099170 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.099220 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.099329 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:47:49.599290533 +0000 UTC m=+21.244208573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.099492 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.099693 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.099720 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.099787 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.100105 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.100677 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.100714 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.100740 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.101132 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.101243 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.101856 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.101887 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.102246 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.102422 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.102604 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.102896 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.102915 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.103653 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.103968 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.104295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.104709 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.105334 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.105438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.106925 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.107142 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:49.607099598 +0000 UTC m=+21.252017608 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.107185 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.107562 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.107963 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.108316 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.108466 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.108743 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.108859 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.109037 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.109214 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.109274 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.109324 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:49.609308915 +0000 UTC m=+21.254227015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.110288 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.110654 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.110895 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.112015 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.112129 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.112221 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.114911 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.126142 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.126201 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.126233 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.126345 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 17:47:49.626308339 +0000 UTC m=+21.271226389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.128251 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.130969 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.131078 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.133073 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.133254 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.133956 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.134319 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.134382 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.134405 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.134504 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:49.634479845 +0000 UTC m=+21.279397955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.137925 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.139042 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.139225 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.139865 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.140095 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.140339 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.140956 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.141236 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.141493 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.141663 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.141754 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.141893 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.143157 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.143327 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.143331 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.143448 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.143771 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.144050 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.145249 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.145248 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.145340 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.145578 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.146025 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.146494 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.147163 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.148355 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.147616 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.149554 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150068 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150103 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150413 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150587 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.150726 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.151808 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.151970 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.152324 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.153376 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.153502 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.154063 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.154588 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.155508 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.156432 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.156742 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.156817 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff"} Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.156757 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.156790 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.157346 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.157383 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.157359 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.157494 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.157701 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.158226 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.158523 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.161792 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.162210 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.162530 4762 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.163055 4762 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.163939 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.164189 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.165923 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.166061 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.167304 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.178201 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.181280 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.186215 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.186511 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.188349 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.189962 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190009 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190072 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190138 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190157 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190173 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190186 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190197 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190209 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190221 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190232 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190247 4762 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190263 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node 
\"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190279 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190310 4762 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190321 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190332 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190341 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190350 4762 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190358 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190366 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190375 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190384 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190393 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190402 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190410 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190418 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190426 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190435 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190443 4762 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190450 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190459 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190466 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190475 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190483 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190491 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190499 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190507 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190517 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190526 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" 
DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190533 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190541 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190548 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190556 4762 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190564 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190572 4762 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190581 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190589 4762 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190598 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190606 4762 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190615 4762 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190639 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190651 4762 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190661 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190669 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190678 4762 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190687 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190695 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190703 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190711 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190720 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190728 4762 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190737 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190745 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190753 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190762 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190771 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190779 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190787 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: 
I0217 17:47:49.190795 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190803 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190813 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190825 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190836 4762 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190849 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190859 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190867 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190875 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190883 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190891 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190899 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190907 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190915 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190923 4762 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on 
node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190930 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190939 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190946 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190954 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190962 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190970 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190977 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190986 4762 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.190993 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191001 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191009 4762 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191017 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191024 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191032 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191040 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191048 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191056 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191064 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191072 4762 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191079 4762 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191087 4762 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191096 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191105 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191112 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191119 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191138 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191146 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191154 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191162 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" 
DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191170 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191177 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191185 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191195 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191203 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191211 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191218 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191226 
4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191233 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191241 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191248 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191256 4762 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191264 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191272 4762 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191280 4762 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191287 4762 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191295 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191303 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191311 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191319 4762 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191326 4762 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191334 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191343 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191351 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191361 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191368 4762 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191376 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191383 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191391 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node 
\"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191399 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191407 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191415 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191422 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191430 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191438 4762 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191447 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: 
I0217 17:47:49.191454 4762 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191462 4762 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191471 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191478 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191485 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191493 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191501 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191529 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" 
(UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191538 4762 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191546 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191555 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191563 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191572 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191580 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191588 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191596 4762 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191604 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191639 4762 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191648 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191656 4762 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191666 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191674 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 
17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191682 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191689 4762 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191697 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191705 4762 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191713 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191721 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191729 4762 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191736 4762 reconciler_common.go:293] "Volume 
detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191744 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191753 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191762 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191771 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191779 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191787 4762 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191795 4762 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191803 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191811 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191819 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191827 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.191834 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.193561 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.201749 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.208565 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.217196 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.224407 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.234189 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.245998 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.254845 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.263324 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.272320 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.280461 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.290457 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.300580 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.303681 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.313674 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:47:49 crc kubenswrapper[4762]: W0217 17:47:49.315410 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-4ae04f4b55c4d4268001f1cf1212731de8a0797034eb28fdc3660b501a4b7b84 WatchSource:0}: Error finding container 4ae04f4b55c4d4268001f1cf1212731de8a0797034eb28fdc3660b501a4b7b84: Status 404 returned error can't find the container with id 4ae04f4b55c4d4268001f1cf1212731de8a0797034eb28fdc3660b501a4b7b84 Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.323996 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:47:49 crc kubenswrapper[4762]: W0217 17:47:49.340393 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-142253e1925e41d6864454a932c5876447ea2de4562eb0f2191a009fe8ade77a WatchSource:0}: Error finding container 142253e1925e41d6864454a932c5876447ea2de4562eb0f2191a009fe8ade77a: Status 404 returned error can't find the container with id 142253e1925e41d6864454a932c5876447ea2de4562eb0f2191a009fe8ade77a Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.694424 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.694502 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694543 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:47:50.694526116 +0000 UTC m=+22.339444126 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.694567 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.694591 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:49 
crc kubenswrapper[4762]: I0217 17:47:49.694610 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694637 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694656 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694671 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694671 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694702 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:50.694695831 +0000 UTC m=+22.339613841 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694713 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:50.694708322 +0000 UTC m=+22.339626332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694836 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694879 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694892 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc 
kubenswrapper[4762]: E0217 17:47:49.694948 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:50.694930738 +0000 UTC m=+22.339848748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.694847 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: E0217 17:47:49.695100 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:50.695068872 +0000 UTC m=+22.339986872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:49 crc kubenswrapper[4762]: I0217 17:47:49.994328 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 12:06:16.510403027 +0000 UTC Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.162874 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753"} Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.162946 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8"} Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.162968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"142253e1925e41d6864454a932c5876447ea2de4562eb0f2191a009fe8ade77a"} Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.164731 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9d57221eb883efc70e4840d461e1b9f6eb6e62360cb87418a136294b2401f580"} Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 
17:47:50.166398 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b"} Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.166444 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4ae04f4b55c4d4268001f1cf1212731de8a0797034eb28fdc3660b501a4b7b84"} Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.179174 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.197101 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.213307 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.227899 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.241342 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.256894 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.269557 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.283119 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.295290 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.309272 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.328859 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.346156 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.359071 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.377753 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.391973 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.407615 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:50Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.701666 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.701748 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.701783 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.701812 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.701840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.701901 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:47:52.701869442 +0000 UTC m=+24.346787462 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.701915 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.701976 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.701975 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:52.701965665 +0000 UTC m=+24.346883695 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702034 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702099 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702117 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702132 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702099 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702169 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702047 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:52.702030067 +0000 UTC m=+24.346948147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702205 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:52.702195762 +0000 UTC m=+24.347113782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:50 crc kubenswrapper[4762]: E0217 17:47:50.702219 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:52.702212212 +0000 UTC m=+24.347130232 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:50 crc kubenswrapper[4762]: I0217 17:47:50.994853 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:00:23.187689213 +0000 UTC Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.035371 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.035451 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:51 crc kubenswrapper[4762]: E0217 17:47:51.035537 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.035388 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:51 crc kubenswrapper[4762]: E0217 17:47:51.035612 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:47:51 crc kubenswrapper[4762]: E0217 17:47:51.035809 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.038994 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.039465 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.040666 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.041254 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.042170 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.042773 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.043422 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.044476 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.045103 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.046157 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.046614 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.047661 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.048124 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.048678 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.049556 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.050061 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.050989 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.051345 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.051873 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.052773 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.053197 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.054089 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.054485 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.055509 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.055940 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.056494 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.057506 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.058074 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.058933 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.059351 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.060135 4762 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.060233 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.061840 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.062693 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.063108 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.064480 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.065101 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.066017 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.066615 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.067567 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.068037 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.068991 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.069588 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.070591 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.071048 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.071915 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.072438 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.073485 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.073933 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.074754 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.075186 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.076060 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.076578 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.077050 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 17:47:51 crc kubenswrapper[4762]: I0217 17:47:51.996470 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 06:28:36.59918994 +0000 UTC Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.174530 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254"} Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.189135 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.203035 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.214069 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.224593 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.237946 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.249650 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.260601 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.272171 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:52Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.718107 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.718202 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718287 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718295 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:47:56.718263088 +0000 UTC m=+28.363181098 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718343 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:56.71832642 +0000 UTC m=+28.363244470 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.718372 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.718404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.718432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718515 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718574 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:56.718551497 +0000 UTC m=+28.363469557 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718596 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718605 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718644 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718657 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718666 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718680 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718709 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:56.718698571 +0000 UTC m=+28.363616631 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:52 crc kubenswrapper[4762]: E0217 17:47:52.718744 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:47:56.718720702 +0000 UTC m=+28.363638742 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.853277 4762 csr.go:261] certificate signing request csr-89hbm is approved, waiting to be issued Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.862315 4762 csr.go:257] certificate signing request csr-89hbm is issued Feb 17 17:47:52 crc kubenswrapper[4762]: I0217 17:47:52.997378 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 18:30:46.244264297 +0000 UTC Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.035440 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.035466 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.035448 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:53 crc kubenswrapper[4762]: E0217 17:47:53.035588 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:47:53 crc kubenswrapper[4762]: E0217 17:47:53.035752 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:47:53 crc kubenswrapper[4762]: E0217 17:47:53.035931 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.246218 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zgv5j"] Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.246477 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.248478 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.248479 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.248496 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.261593 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.274913 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.288945 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.298760 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.318208 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.323860 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/166682c4-697f-453c-b43a-e649aaeb0c69-hosts-file\") pod \"node-resolver-zgv5j\" (UID: \"166682c4-697f-453c-b43a-e649aaeb0c69\") " pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.323905 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmvv\" (UniqueName: \"kubernetes.io/projected/166682c4-697f-453c-b43a-e649aaeb0c69-kube-api-access-rmmvv\") pod \"node-resolver-zgv5j\" (UID: \"166682c4-697f-453c-b43a-e649aaeb0c69\") " pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.333159 4762 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.351141 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.365031 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.376972 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.425295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmvv\" (UniqueName: \"kubernetes.io/projected/166682c4-697f-453c-b43a-e649aaeb0c69-kube-api-access-rmmvv\") pod \"node-resolver-zgv5j\" (UID: \"166682c4-697f-453c-b43a-e649aaeb0c69\") " pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.425609 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/166682c4-697f-453c-b43a-e649aaeb0c69-hosts-file\") pod \"node-resolver-zgv5j\" (UID: \"166682c4-697f-453c-b43a-e649aaeb0c69\") " pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.425735 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/166682c4-697f-453c-b43a-e649aaeb0c69-hosts-file\") pod \"node-resolver-zgv5j\" (UID: \"166682c4-697f-453c-b43a-e649aaeb0c69\") " 
pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.451914 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmvv\" (UniqueName: \"kubernetes.io/projected/166682c4-697f-453c-b43a-e649aaeb0c69-kube-api-access-rmmvv\") pod \"node-resolver-zgv5j\" (UID: \"166682c4-697f-453c-b43a-e649aaeb0c69\") " pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.557546 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zgv5j" Feb 17 17:47:53 crc kubenswrapper[4762]: W0217 17:47:53.569219 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166682c4_697f_453c_b43a_e649aaeb0c69.slice/crio-88aa5016a07ac0134a4b7bbc54c260cb18545305410c9aad1e8801ff405c5769 WatchSource:0}: Error finding container 88aa5016a07ac0134a4b7bbc54c260cb18545305410c9aad1e8801ff405c5769: Status 404 returned error can't find the container with id 88aa5016a07ac0134a4b7bbc54c260cb18545305410c9aad1e8801ff405c5769 Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.633529 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jb9kz"] Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.633907 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.634492 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kg68g"] Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.635176 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642122 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642138 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642192 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642235 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642494 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642633 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642664 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642835 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642903 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.642907 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.645772 
4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-k2xfd"] Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.646009 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.647796 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.649921 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.666672 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.680185 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.692173 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.704759 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.719574 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728094 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-cni-multus\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728141 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-etc-kubernetes\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728162 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728182 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-system-cni-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728201 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-netns\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728220 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82z4c\" (UniqueName: \"kubernetes.io/projected/d0f706d4-18a1-44c0-8913-b46af7876ee7-kube-api-access-82z4c\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728249 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-os-release\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728274 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fmm\" (UniqueName: \"kubernetes.io/projected/132714a2-f72f-40f0-8156-33fa78780072-kube-api-access-t5fmm\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728304 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0f706d4-18a1-44c0-8913-b46af7876ee7-cni-binary-copy\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-k8s-cni-cncf-io\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728347 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-system-cni-dir\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7389b1a3-5839-49b0-97e8-2adcbe0fd491-proxy-tls\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " 
pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728396 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-hostroot\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728445 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7389b1a3-5839-49b0-97e8-2adcbe0fd491-mcd-auth-proxy-config\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728547 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bmp\" (UniqueName: \"kubernetes.io/projected/7389b1a3-5839-49b0-97e8-2adcbe0fd491-kube-api-access-p5bmp\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728612 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-cni-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728656 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-cnibin\") pod \"multus-k2xfd\" (UID: 
\"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-conf-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728718 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/132714a2-f72f-40f0-8156-33fa78780072-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728742 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-daemon-config\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728764 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-socket-dir-parent\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728784 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-cni-bin\") pod 
\"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-multus-certs\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728841 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-cnibin\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728861 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-kubelet\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728881 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7389b1a3-5839-49b0-97e8-2adcbe0fd491-rootfs\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728914 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-os-release\") pod 
\"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.728943 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/132714a2-f72f-40f0-8156-33fa78780072-cni-binary-copy\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.729402 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.740778 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.757352 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.768632 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.781926 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.793603 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.805012 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.827452 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829730 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bmp\" (UniqueName: \"kubernetes.io/projected/7389b1a3-5839-49b0-97e8-2adcbe0fd491-kube-api-access-p5bmp\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829766 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-cni-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829788 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-cnibin\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829807 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-conf-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829827 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/132714a2-f72f-40f0-8156-33fa78780072-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-daemon-config\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829866 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-socket-dir-parent\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-cni-bin\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829903 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-multus-certs\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829918 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-cnibin\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829932 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-kubelet\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829946 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7389b1a3-5839-49b0-97e8-2adcbe0fd491-rootfs\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829945 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-conf-dir\") pod 
\"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.829968 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-os-release\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830046 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/132714a2-f72f-40f0-8156-33fa78780072-cni-binary-copy\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-cni-multus\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830142 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-etc-kubernetes\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830170 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " 
pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830200 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-system-cni-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830222 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-netns\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830234 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-cni-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830245 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82z4c\" (UniqueName: \"kubernetes.io/projected/d0f706d4-18a1-44c0-8913-b46af7876ee7-kube-api-access-82z4c\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830291 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-os-release\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830315 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fmm\" (UniqueName: \"kubernetes.io/projected/132714a2-f72f-40f0-8156-33fa78780072-kube-api-access-t5fmm\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830343 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-os-release\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830351 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0f706d4-18a1-44c0-8913-b46af7876ee7-cni-binary-copy\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830391 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-k8s-cni-cncf-io\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830418 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-system-cni-dir\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830443 4762 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7389b1a3-5839-49b0-97e8-2adcbe0fd491-proxy-tls\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-hostroot\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830488 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7389b1a3-5839-49b0-97e8-2adcbe0fd491-mcd-auth-proxy-config\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830646 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/132714a2-f72f-40f0-8156-33fa78780072-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830709 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-daemon-config\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830752 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-kubelet\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830798 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-multus-socket-dir-parent\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830832 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-cni-bin\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830860 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-multus-certs\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830897 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-cnibin\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830902 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-var-lib-cni-multus\") pod \"multus-k2xfd\" (UID: 
\"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830942 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-os-release\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.830970 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-etc-kubernetes\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831000 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-system-cni-dir\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831026 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7389b1a3-5839-49b0-97e8-2adcbe0fd491-rootfs\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831047 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-k8s-cni-cncf-io\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 
17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831097 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-system-cni-dir\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831142 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0f706d4-18a1-44c0-8913-b46af7876ee7-cni-binary-copy\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831163 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7389b1a3-5839-49b0-97e8-2adcbe0fd491-mcd-auth-proxy-config\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831198 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-hostroot\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831211 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-host-run-netns\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831239 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/132714a2-f72f-40f0-8156-33fa78780072-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831279 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/132714a2-f72f-40f0-8156-33fa78780072-cni-binary-copy\") pod \"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.831369 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0f706d4-18a1-44c0-8913-b46af7876ee7-cnibin\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.836242 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7389b1a3-5839-49b0-97e8-2adcbe0fd491-proxy-tls\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.849895 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.855213 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bmp\" (UniqueName: \"kubernetes.io/projected/7389b1a3-5839-49b0-97e8-2adcbe0fd491-kube-api-access-p5bmp\") pod \"machine-config-daemon-jb9kz\" (UID: \"7389b1a3-5839-49b0-97e8-2adcbe0fd491\") " pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.859292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82z4c\" (UniqueName: \"kubernetes.io/projected/d0f706d4-18a1-44c0-8913-b46af7876ee7-kube-api-access-82z4c\") pod \"multus-k2xfd\" (UID: \"d0f706d4-18a1-44c0-8913-b46af7876ee7\") " pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.860049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fmm\" (UniqueName: \"kubernetes.io/projected/132714a2-f72f-40f0-8156-33fa78780072-kube-api-access-t5fmm\") pod 
\"multus-additional-cni-plugins-kg68g\" (UID: \"132714a2-f72f-40f0-8156-33fa78780072\") " pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.866953 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.871696 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 17:42:52 +0000 UTC, rotation deadline is 2027-01-10 00:10:55.657014074 +0000 UTC Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.871764 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7830h23m1.785252757s for next certificate rotation Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.884212 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.896384 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.907472 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.918005 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.927657 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.939384 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.949136 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:53Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.953328 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.966575 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kg68g" Feb 17 17:47:53 crc kubenswrapper[4762]: W0217 17:47:53.967052 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7389b1a3_5839_49b0_97e8_2adcbe0fd491.slice/crio-f6783fc51dbbcd3b5a99ebaf3a4f06a9a8949086efae2d5e186fd96d6ead3eae WatchSource:0}: Error finding container f6783fc51dbbcd3b5a99ebaf3a4f06a9a8949086efae2d5e186fd96d6ead3eae: Status 404 returned error can't find the container with id f6783fc51dbbcd3b5a99ebaf3a4f06a9a8949086efae2d5e186fd96d6ead3eae Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.977778 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k2xfd" Feb 17 17:47:53 crc kubenswrapper[4762]: W0217 17:47:53.989011 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod132714a2_f72f_40f0_8156_33fa78780072.slice/crio-c17e2a9dae1cca3407858d2064c7de26dac1c4e8d8b63726f27251894d595922 WatchSource:0}: Error finding container c17e2a9dae1cca3407858d2064c7de26dac1c4e8d8b63726f27251894d595922: Status 404 returned error can't find the container with id c17e2a9dae1cca3407858d2064c7de26dac1c4e8d8b63726f27251894d595922 Feb 17 17:47:53 crc kubenswrapper[4762]: I0217 17:47:53.998279 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 01:25:43.247059408 +0000 UTC Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.028509 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f6zrt"] Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.029451 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.031072 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.032180 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.032199 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.032214 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.032242 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.032459 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.032816 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.048064 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.059881 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.079318 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.092068 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.103250 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.115149 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.124859 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-etc-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: 
\"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134334 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e901c69-4b38-4f54-9811-83bd34c46a07-ovn-node-metrics-cert\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134364 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-var-lib-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-config\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134421 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134444 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmmf\" (UniqueName: 
\"kubernetes.io/projected/8e901c69-4b38-4f54-9811-83bd34c46a07-kube-api-access-xwmmf\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134466 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-netns\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134482 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-log-socket\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134497 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-slash\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134512 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-node-log\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134526 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-kubelet\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134541 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-systemd\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134554 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-ovn\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134567 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-bin\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134583 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-script-lib\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134598 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-systemd-units\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134633 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-netd\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134648 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134661 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-env-overrides\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.134680 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.137979 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.148938 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.161029 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.173642 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.180574 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerStarted","Data":"fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.180614 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerStarted","Data":"390187b26f842a3f0240abe0e7288b43916f17be0e491fc2e3c8daa94bcad79a"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 
17:47:54.182160 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.182180 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"f6783fc51dbbcd3b5a99ebaf3a4f06a9a8949086efae2d5e186fd96d6ead3eae"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.184323 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zgv5j" event={"ID":"166682c4-697f-453c-b43a-e649aaeb0c69","Type":"ContainerStarted","Data":"f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.184375 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zgv5j" event={"ID":"166682c4-697f-453c-b43a-e649aaeb0c69","Type":"ContainerStarted","Data":"88aa5016a07ac0134a4b7bbc54c260cb18545305410c9aad1e8801ff405c5769"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.185303 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerStarted","Data":"c17e2a9dae1cca3407858d2064c7de26dac1c4e8d8b63726f27251894d595922"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.186704 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.197954 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.214517 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.228028 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235694 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235761 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmmf\" (UniqueName: \"kubernetes.io/projected/8e901c69-4b38-4f54-9811-83bd34c46a07-kube-api-access-xwmmf\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235828 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-netns\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-log-socket\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235828 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235931 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-netns\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235911 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-slash\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235963 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-slash\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.235995 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-log-socket\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236026 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-node-log\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-kubelet\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236125 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-node-log\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236142 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-systemd\") pod \"ovnkube-node-f6zrt\" (UID: 
\"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236161 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-ovn\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236183 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-kubelet\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236195 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-bin\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236220 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-bin\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-script-lib\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc 
kubenswrapper[4762]: I0217 17:47:54.236239 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-ovn\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236251 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-systemd-units\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236216 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-systemd\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236299 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-systemd-units\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236333 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-netd\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236449 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-netd\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236462 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236486 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-env-overrides\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236491 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236514 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236537 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-etc-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236552 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e901c69-4b38-4f54-9811-83bd34c46a07-ovn-node-metrics-cert\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236584 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-var-lib-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236588 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236632 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-config\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-var-lib-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236764 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-etc-openvswitch\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236867 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-script-lib\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.236983 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-env-overrides\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.237323 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-config\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.240304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e901c69-4b38-4f54-9811-83bd34c46a07-ovn-node-metrics-cert\") pod 
\"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.241721 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.255645 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.258333 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmmf\" (UniqueName: \"kubernetes.io/projected/8e901c69-4b38-4f54-9811-83bd34c46a07-kube-api-access-xwmmf\") pod \"ovnkube-node-f6zrt\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.267316 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.279392 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.291024 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.302667 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.318743 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.329160 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.340842 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.352291 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.376332 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.377469 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: W0217 17:47:54.388776 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e901c69_4b38_4f54_9811_83bd34c46a07.slice/crio-cd2c6574ad6bea413adcb230281e117866ad87bbee89e734f3d32453093b3cc4 WatchSource:0}: Error finding container cd2c6574ad6bea413adcb230281e117866ad87bbee89e734f3d32453093b3cc4: Status 404 returned error can't find the container with id cd2c6574ad6bea413adcb230281e117866ad87bbee89e734f3d32453093b3cc4 Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.478435 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.480146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.480209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.480218 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.480338 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.490177 4762 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.490482 4762 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.491550 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.491580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.491591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.491605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.491615 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: E0217 17:47:54.515259 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.519376 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.519418 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.519435 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.519456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.519471 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: E0217 17:47:54.562057 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.565265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.565304 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.565315 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.565332 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.565345 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: E0217 17:47:54.578499 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:54Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:54 crc kubenswrapper[4762]: E0217 17:47:54.578943 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.580929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.581052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.581154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.581249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.581335 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.684134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.684176 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.684188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.684210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.684225 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.786320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.786373 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.786390 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.786411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.786428 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.889317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.889351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.889359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.889374 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.889383 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.991670 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.991968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.992092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.992230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.992347 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:54Z","lastTransitionTime":"2026-02-17T17:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:54 crc kubenswrapper[4762]: I0217 17:47:54.999134 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 09:00:25.896530254 +0000 UTC Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.035691 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.035717 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:55 crc kubenswrapper[4762]: E0217 17:47:55.035872 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.035719 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:55 crc kubenswrapper[4762]: E0217 17:47:55.036048 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:47:55 crc kubenswrapper[4762]: E0217 17:47:55.036139 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.095413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.095726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.095739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.095756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.095769 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.191081 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.192679 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" exitCode=0 Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.192742 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.192763 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"cd2c6574ad6bea413adcb230281e117866ad87bbee89e734f3d32453093b3cc4"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.194072 4762 generic.go:334] "Generic (PLEG): container finished" podID="132714a2-f72f-40f0-8156-33fa78780072" containerID="7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53" exitCode=0 Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.194109 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerDied","Data":"7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.203836 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.211151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.211181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.211192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.211208 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.211217 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.219220 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.234921 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.245908 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.304467 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.315356 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.315394 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.315405 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc 
kubenswrapper[4762]: I0217 17:47:55.315420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.315431 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.316773 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.334689 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.345476 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.358222 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.368667 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.381178 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.394179 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.407224 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.417709 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.417746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.417756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.417770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.417780 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.418983 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.429938 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.453069 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.466161 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.478389 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.488490 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.504770 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.516972 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.519572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 
17:47:55.519597 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.519606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.519633 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.519645 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.528769 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.540365 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.552281 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.566591 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.579024 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.621125 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.621166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.621176 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.621193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.621204 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.724610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.724651 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.724659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.724672 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.724690 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.834225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.834703 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.834719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.834737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.834749 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.937748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.937786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.937794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.937807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:55 crc kubenswrapper[4762]: I0217 17:47:55.937817 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:55Z","lastTransitionTime":"2026-02-17T17:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.000663 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:27:25.419520009 +0000 UTC Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.040548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.040610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.040664 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.040695 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.040713 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.144487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.144541 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.144550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.144564 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.144573 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.203043 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.203094 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.203110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.203126 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.203139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.203151 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} Feb 17 17:47:56 crc kubenswrapper[4762]: 
I0217 17:47:56.205750 4762 generic.go:334] "Generic (PLEG): container finished" podID="132714a2-f72f-40f0-8156-33fa78780072" containerID="8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a" exitCode=0 Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.205816 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerDied","Data":"8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.223682 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.228073 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fzb7v"] Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.228473 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.229903 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.230692 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.231208 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.231417 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.237815 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.247081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.247120 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.247133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.247153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.247165 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.252469 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.255022 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46ea121f-8e60-4e68-af96-9c972a27988b-serviceca\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.255195 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46ea121f-8e60-4e68-af96-9c972a27988b-host\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.255235 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9ng\" (UniqueName: \"kubernetes.io/projected/46ea121f-8e60-4e68-af96-9c972a27988b-kube-api-access-9h9ng\") pod 
\"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.272578 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\
\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.291068 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.303903 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.316233 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.327333 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.341224 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.349641 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.349685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.349695 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.349712 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.349726 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.353857 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.356392 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46ea121f-8e60-4e68-af96-9c972a27988b-host\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.356431 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9ng\" (UniqueName: \"kubernetes.io/projected/46ea121f-8e60-4e68-af96-9c972a27988b-kube-api-access-9h9ng\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.356469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46ea121f-8e60-4e68-af96-9c972a27988b-serviceca\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.356543 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46ea121f-8e60-4e68-af96-9c972a27988b-host\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.358488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46ea121f-8e60-4e68-af96-9c972a27988b-serviceca\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.367384 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.372974 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9ng\" (UniqueName: \"kubernetes.io/projected/46ea121f-8e60-4e68-af96-9c972a27988b-kube-api-access-9h9ng\") pod \"node-ca-fzb7v\" (UID: \"46ea121f-8e60-4e68-af96-9c972a27988b\") " pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.378507 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.390763 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.402728 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.414965 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.424846 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.438472 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.449577 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.452738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.452774 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.452786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 
17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.452802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.452814 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.461859 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.475678 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.488456 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.498770 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.511522 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.521800 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.539098 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.542153 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fzb7v" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.555379 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.555433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.555451 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.555474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.555491 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.558381 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z 
is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.580016 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.657912 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.657955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.657966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.657982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.657992 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760041 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760171 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:48:04.760154463 +0000 UTC m=+36.405072473 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760259 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760431 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760454 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760571 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760459 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760695 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:04.760682609 +0000 UTC m=+36.405600639 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760717 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.760748 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760761 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760834 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760835 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 17:48:04.760812393 +0000 UTC m=+36.405730413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760848 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760873 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760894 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760907 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760921 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760923 4762 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:04.760911196 +0000 UTC m=+36.405829316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:56 crc kubenswrapper[4762]: E0217 17:47:56.760966 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:04.760956767 +0000 UTC m=+36.405874797 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.862667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.862695 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.862703 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.862717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.862729 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.967924 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.968229 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.968264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.968356 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:56 crc kubenswrapper[4762]: I0217 17:47:56.968457 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:56Z","lastTransitionTime":"2026-02-17T17:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.001484 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 11:59:49.539631178 +0000 UTC Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.035896 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.035913 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:57 crc kubenswrapper[4762]: E0217 17:47:57.036035 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:47:57 crc kubenswrapper[4762]: E0217 17:47:57.036110 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.035931 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:57 crc kubenswrapper[4762]: E0217 17:47:57.036180 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.070815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.070862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.070874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.070892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.070905 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.174377 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.174459 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.174485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.174518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.174541 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.215281 4762 generic.go:334] "Generic (PLEG): container finished" podID="132714a2-f72f-40f0-8156-33fa78780072" containerID="c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6" exitCode=0 Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.215412 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerDied","Data":"c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.218296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fzb7v" event={"ID":"46ea121f-8e60-4e68-af96-9c972a27988b","Type":"ContainerStarted","Data":"8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.218356 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fzb7v" event={"ID":"46ea121f-8e60-4e68-af96-9c972a27988b","Type":"ContainerStarted","Data":"a6996250b48db4b1d4bf820876ebf646a25d19035bc3de740b4ab909324f363a"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.253909 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.268862 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.277460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.277502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.277518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 
17:47:57.277535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.277548 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.286224 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.300730 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72b
b276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.319360 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 
17:47:57.331138 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.342802 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.355892 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.369923 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.380538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.380572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.380580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.380593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.380603 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.384701 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.397777 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.408909 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.422379 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.435002 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.446256 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.459357 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.472177 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.482306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.482350 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.482433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.482460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.482475 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.484634 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.498080 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.510896 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.522575 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.543040 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.560987 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.574195 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.584532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.584573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.584581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.584595 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.584605 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.584907 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.595448 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.608380 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 
17:47:57.617322 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:57Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.687365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.687423 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.687437 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.687453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.687463 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.790282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.790322 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.790334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.790351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.790363 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.893841 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.893886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.893898 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.893914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.893927 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.997186 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.997257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.997280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.997310 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:57 crc kubenswrapper[4762]: I0217 17:47:57.997338 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:57Z","lastTransitionTime":"2026-02-17T17:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.002524 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 22:56:25.780994397 +0000 UTC Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.101101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.101162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.101180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.101203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.101220 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.203802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.203875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.203898 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.203927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.203949 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.229473 4762 generic.go:334] "Generic (PLEG): container finished" podID="132714a2-f72f-40f0-8156-33fa78780072" containerID="9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d" exitCode=0 Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.229538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerDied","Data":"9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.246911 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.262816 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.279187 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.292734 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.307200 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.308253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.308306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.308319 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.308463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.308481 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.318815 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.332096 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.347615 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.373391 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.386530 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.399183 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.407657 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.410893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.410922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.410932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.410947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.410958 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.422286 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.434251 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:58Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.514115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.514162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.514173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.514190 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.514204 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.620990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.621056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.621073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.621097 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.621114 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.723464 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.723498 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.723507 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.723521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.723529 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.826828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.827252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.827272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.827299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.827318 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.830048 4762 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.929676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.929702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.929710 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.929723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:58 crc kubenswrapper[4762]: I0217 17:47:58.929732 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:58Z","lastTransitionTime":"2026-02-17T17:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.003241 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 20:13:15.802133221 +0000 UTC Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.031855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.031897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.031909 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.031926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.031937 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.035812 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:47:59 crc kubenswrapper[4762]: E0217 17:47:59.035900 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.036139 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:47:59 crc kubenswrapper[4762]: E0217 17:47:59.036201 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.036512 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:47:59 crc kubenswrapper[4762]: E0217 17:47:59.036581 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.048344 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.063962 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.090711 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.102539 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.112536 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.122977 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.156834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.156870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.156878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.156892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.156901 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.160965 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.188730 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.211077 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.223269 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.235271 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.237970 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" 
event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.240302 4762 generic.go:334] "Generic (PLEG): container finished" podID="132714a2-f72f-40f0-8156-33fa78780072" containerID="7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40" exitCode=0 Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.240326 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerDied","Data":"7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.245507 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b
442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53
Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.257563 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.260431 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.260457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.260468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.260483 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.260494 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.270014 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.284382 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.296259 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.308200 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.320875 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.337850 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.350061 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.360525 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.363258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.363295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.363306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc 
kubenswrapper[4762]: I0217 17:47:59.363323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.363333 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.373856 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.389222 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.400473 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.416040 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.430943 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha2
56:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.447730 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.462057 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:47:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.465881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 
17:47:59.465945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.465965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.465997 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.466035 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.569049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.569103 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.569115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.569131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.569140 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.672327 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.672413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.672438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.672471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.672494 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.775239 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.775285 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.775296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.775312 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.775326 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.878310 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.878374 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.878386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.878405 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.878416 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.981797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.981849 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.981861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.981884 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:47:59 crc kubenswrapper[4762]: I0217 17:47:59.981896 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:47:59Z","lastTransitionTime":"2026-02-17T17:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.003348 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 11:00:31.812171386 +0000 UTC Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.084338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.084373 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.084384 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.084401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.084412 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.187488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.187537 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.187552 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.187576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.187593 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.246871 4762 generic.go:334] "Generic (PLEG): container finished" podID="132714a2-f72f-40f0-8156-33fa78780072" containerID="a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817" exitCode=0 Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.246916 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerDied","Data":"a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.265685 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.290024 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.290132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.290145 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.290162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.290232 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.293930 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.309783 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.331064 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.348900 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.366025 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.377893 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.391333 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.393480 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.393516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.393529 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.393546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.393558 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.402151 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.412280 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.423145 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.436315 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.446613 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.458439 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.496460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.496496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.496507 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.496523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.496534 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.598471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.598505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.598517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.598533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.598544 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.701202 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.701229 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.701239 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.701254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.701265 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.804470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.804520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.804532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.804548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.804560 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.907405 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.907461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.907476 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.907496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:00 crc kubenswrapper[4762]: I0217 17:48:00.907512 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:00Z","lastTransitionTime":"2026-02-17T17:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.003948 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:23:18.661351918 +0000 UTC Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.010747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.010806 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.010825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.010849 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.010866 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.035347 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.035438 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:01 crc kubenswrapper[4762]: E0217 17:48:01.035517 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:01 crc kubenswrapper[4762]: E0217 17:48:01.035731 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.035781 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:01 crc kubenswrapper[4762]: E0217 17:48:01.035889 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.113421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.113479 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.113492 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.113513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.113526 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.216388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.216425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.216433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.216447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.216456 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.254787 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.255420 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.255569 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.263099 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" event={"ID":"132714a2-f72f-40f0-8156-33fa78780072","Type":"ContainerStarted","Data":"783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.274804 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.281838 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.285275 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.290009 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.297552 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de
2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.307257 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.318663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.318739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.318750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc 
kubenswrapper[4762]: I0217 17:48:01.318846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.318858 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.319086 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.333019 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.350416 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.363151 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.378604 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.393209 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.404517 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.417063 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.420497 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.420527 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.420536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 
17:48:01.420550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.420559 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.435890 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.462381 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.475752 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.487128 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.498926 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.510550 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.522077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc 
kubenswrapper[4762]: I0217 17:48:01.522125 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.522136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.522153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.522166 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.527443 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.540351 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783
630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.550129 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.563308 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.574578 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.584090 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.595976 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.607166 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.617293 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.624322 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.624354 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.624364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc 
kubenswrapper[4762]: I0217 17:48:01.624377 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.624386 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.628521 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.727211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 
17:48:01.727256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.727269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.727285 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.727298 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.830682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.830763 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.830775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.830792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.830805 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.933212 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.933302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.933330 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.933361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:01 crc kubenswrapper[4762]: I0217 17:48:01.933382 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:01Z","lastTransitionTime":"2026-02-17T17:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.004152 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 23:08:41.519794798 +0000 UTC Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.037187 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.037237 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.037248 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.037268 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.037280 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.140139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.140188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.140200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.140217 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.140231 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.243043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.243107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.243124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.243153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.243170 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.266297 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.346437 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.346490 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.346512 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.346533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.346547 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.451251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.451294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.451306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.451323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.451334 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.558499 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.558558 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.558576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.558601 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.558620 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.661114 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.661142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.661150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.661162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.661172 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.763027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.763066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.763076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.763092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.763101 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.865912 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.865977 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.865991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.866019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.866036 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.968888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.968926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.968936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.968950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:02 crc kubenswrapper[4762]: I0217 17:48:02.968959 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:02Z","lastTransitionTime":"2026-02-17T17:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.004799 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:31:27.806755871 +0000 UTC Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.035331 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.035394 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.035405 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:03 crc kubenswrapper[4762]: E0217 17:48:03.035460 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:03 crc kubenswrapper[4762]: E0217 17:48:03.035528 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:03 crc kubenswrapper[4762]: E0217 17:48:03.035594 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.071398 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.071661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.071672 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.071686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.071695 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.174759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.174812 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.174828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.174850 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.174866 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.269940 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.277100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.277178 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.277199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.277224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.277245 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.380557 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.380644 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.380658 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.380674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.380689 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.483485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.483570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.483584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.483601 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.483612 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.586360 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.586395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.586407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.586419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.586428 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.688718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.688761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.688772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.688787 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.688798 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.791023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.791076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.791126 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.791151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.791167 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.893585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.893644 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.893655 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.893671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.893680 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.996113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.996149 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.996158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.996173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:03 crc kubenswrapper[4762]: I0217 17:48:03.996181 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:03Z","lastTransitionTime":"2026-02-17T17:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.005751 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:58:25.108951366 +0000 UTC Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.099071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.099128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.099142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.099165 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.099181 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.202521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.202599 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.202707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.202749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.202775 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.275528 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/0.log" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.279342 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b" exitCode=1 Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.279389 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.280408 4762 scope.go:117] "RemoveContainer" containerID="aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.301213 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93
fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.304980 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.305019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.305031 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.305050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.305063 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.314526 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.327989 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.340131 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.350350 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.365574 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.376727 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.388852 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.401772 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.407723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.407783 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.407796 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.407813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.407824 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.415035 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.426816 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.440887 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.457854 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.486896 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:03Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0217 17:48:03.287514 6092 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287545 6092 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287580 6092 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287761 6092 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287970 6092 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.288273 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:48:03.288298 6092 factory.go:656] Stopping watch factory\\\\nI0217 17:48:03.288318 6092 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.509955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.509987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.509996 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.510009 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.510018 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.612091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.612123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.612133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.612148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.612159 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.714316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.714364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.714372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.714385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.714395 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.772171 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.772245 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.772272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.772310 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.772335 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.789184 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.793033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.793064 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.793074 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.793087 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.793096 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.844169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.844213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.844224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.844243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.844254 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.848112 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.848171 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.848190 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.848211 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.848230 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848298 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848336 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:20.848323912 +0000 UTC m=+52.493241922 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848386 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848416 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848414 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848463 4762 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:48:20.848436105 +0000 UTC m=+52.493354125 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848427 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848492 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:20.848484167 +0000 UTC m=+52.493402297 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848527 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:20.848513338 +0000 UTC m=+52.493431348 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848608 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848657 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848668 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.848707 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:20.848697003 +0000 UTC m=+52.493615093 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.856559 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:04Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:04 crc kubenswrapper[4762]: E0217 17:48:04.856684 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.858463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.858500 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.858509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.858527 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.858536 4762 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.960905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.960961 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.960979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.961006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:04 crc kubenswrapper[4762]: I0217 17:48:04.961023 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:04Z","lastTransitionTime":"2026-02-17T17:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.006722 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 01:05:28.997823743 +0000 UTC Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.035324 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.035382 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:05 crc kubenswrapper[4762]: E0217 17:48:05.035437 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:05 crc kubenswrapper[4762]: E0217 17:48:05.035496 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.035565 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:05 crc kubenswrapper[4762]: E0217 17:48:05.035645 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.063591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.063635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.063646 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.063659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.063668 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.166485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.166524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.166533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.166550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.166559 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.269300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.269482 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.269508 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.269538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.269561 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.285828 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/1.log" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.286581 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/0.log" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.294722 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85" exitCode=1 Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.294791 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.294848 4762 scope.go:117] "RemoveContainer" containerID="aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.296122 4762 scope.go:117] "RemoveContainer" containerID="b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85" Feb 17 17:48:05 crc kubenswrapper[4762]: E0217 17:48:05.296456 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.321608 4762 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.335380 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.349872 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.372563 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.372663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.372692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.372756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.372782 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.375397 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.389042 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.406306 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.421761 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.443685 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.464818 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.475701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 
17:48:05.475777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.475799 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.475825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.475842 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.487086 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.506673 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.528683 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.547976 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.578318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc 
kubenswrapper[4762]: I0217 17:48:05.578385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.578403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.578426 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.578444 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.581747 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm"] Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.582611 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.582892 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:03Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0217 17:48:03.287514 6092 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287545 6092 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287580 6092 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287761 6092 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287970 6092 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.288273 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:48:03.288298 6092 factory.go:656] Stopping watch factory\\\\nI0217 17:48:03.288318 6092 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, 
Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bd
e4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.586170 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.586946 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.604974 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.636330 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.658105 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2z86\" 
(UniqueName: \"kubernetes.io/projected/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-kube-api-access-q2z86\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.658185 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.658248 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.658325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.672541 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:03Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0217 17:48:03.287514 6092 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287545 6092 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287580 6092 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287761 6092 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287970 6092 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.288273 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:48:03.288298 6092 factory.go:656] Stopping watch factory\\\\nI0217 17:48:03.288318 6092 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, 
Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bd
e4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.681498 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.681547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.681564 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.681587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.681610 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.690082 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\
\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.711022 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.724804 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.736343 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.750031 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.759270 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.759334 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2z86\" (UniqueName: \"kubernetes.io/projected/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-kube-api-access-q2z86\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.759363 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.759390 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.760341 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.760498 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.762241 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.768427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.772905 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.780524 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2z86\" (UniqueName: \"kubernetes.io/projected/c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8-kube-api-access-q2z86\") pod \"ovnkube-control-plane-749d76644c-lwrpm\" (UID: \"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.783607 4762 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.783662 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.783674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.783692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.783703 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.786108 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e5
9071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.797965 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.810150 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.825666 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.847080 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:05Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.885819 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.885856 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.885865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.885878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.885887 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.902218 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.993467 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.993528 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.993544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.993567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:05 crc kubenswrapper[4762]: I0217 17:48:05.993588 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:05Z","lastTransitionTime":"2026-02-17T17:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.006825 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 14:05:03.437969056 +0000 UTC Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.096839 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.096899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.096922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.097088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.097116 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.200438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.200468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.200481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.200494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.200503 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302496 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" event={"ID":"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8","Type":"ContainerStarted","Data":"fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302579 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" event={"ID":"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8","Type":"ContainerStarted","Data":"f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302601 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" event={"ID":"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8","Type":"ContainerStarted","Data":"8ba077ff03912cac1fbe7d8599851f053bb6a07d3f8baa0ed2f23143f75a0486"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.302708 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.306088 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/1.log" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.327209 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7844c802b9d45d4fc747c6d5c970faf5ffd8a3933170debfd7d1da0b6c7c7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:03Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0217 17:48:03.287514 6092 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287545 6092 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287580 6092 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287761 6092 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.287970 6092 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 17:48:03.288273 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:48:03.288298 6092 factory.go:656] Stopping watch factory\\\\nI0217 17:48:03.288318 6092 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, 
Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bd
e4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.344043 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.369369 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.387026 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.402442 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.406431 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.406471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.406488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.406509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.406521 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.414384 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.427487 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.428412 4762 scope.go:117] "RemoveContainer" containerID="b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85" Feb 17 17:48:06 crc kubenswrapper[4762]: E0217 17:48:06.428592 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.429929 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.444317 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.455967 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.469919 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.481352 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.498333 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.509569 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.509606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.509635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.509654 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.509669 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.513188 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.525457 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.539140 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.552859 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.568299 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.581950 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071
831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.598788 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.612099 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.612147 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.612160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.612179 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.612194 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.615075 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.634324 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.690396 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.706159 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.714204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.714267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.714279 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.714295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.714307 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.718873 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.728575 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.741573 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.753108 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.763119 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.774578 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.787854 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.816435 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.816467 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.816477 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.816493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.816505 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.853866 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.866225 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.879702 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071
831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.894118 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a4
9e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.911721 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.919199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:06 crc 
kubenswrapper[4762]: I0217 17:48:06.919256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.919273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.919295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.919313 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:06Z","lastTransitionTime":"2026-02-17T17:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.937109 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.951594 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.965658 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.977618 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:06 crc kubenswrapper[4762]: I0217 17:48:06.997839 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.007612 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 15:14:09.950887866 +0000 UTC Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.010257 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.021602 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.021669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.021686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.021734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.021750 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.028787 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.034875 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.034964 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.035152 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:07 crc kubenswrapper[4762]: E0217 17:48:07.035140 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:07 crc kubenswrapper[4762]: E0217 17:48:07.035300 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:07 crc kubenswrapper[4762]: E0217 17:48:07.035397 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.047472 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.062072 4762 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.075130 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.090650 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.124535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.124591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.124608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.124665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.124697 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.228018 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.228065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.228078 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.228096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.228108 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.330775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.330837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.330852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.330872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.330883 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.434547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.434602 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.434614 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.434661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.434674 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.475173 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wdzt7"] Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.476063 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:07 crc kubenswrapper[4762]: E0217 17:48:07.476171 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.498480 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.512287 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.526417 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.537443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.537487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.537502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.537523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.537535 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.543991 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72
d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.557042 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc 
kubenswrapper[4762]: I0217 17:48:07.572245 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.587601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.587651 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtx9h\" (UniqueName: \"kubernetes.io/projected/6bb87d75-4230-44b9-8ee8-7aff6d051904-kube-api-access-xtx9h\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.589538 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.600161 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc 
kubenswrapper[4762]: I0217 17:48:07.612876 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.629975 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.639870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc 
kubenswrapper[4762]: I0217 17:48:07.639907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.639920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.639937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.639947 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.653408 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.667550 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.682654 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.688882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.688976 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtx9h\" (UniqueName: \"kubernetes.io/projected/6bb87d75-4230-44b9-8ee8-7aff6d051904-kube-api-access-xtx9h\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:07 crc kubenswrapper[4762]: E0217 17:48:07.689054 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:07 crc kubenswrapper[4762]: E0217 17:48:07.689126 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:08.189108136 +0000 UTC m=+39.834026136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.697007 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.709163 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.710172 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtx9h\" (UniqueName: \"kubernetes.io/projected/6bb87d75-4230-44b9-8ee8-7aff6d051904-kube-api-access-xtx9h\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.725865 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93
fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.741864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.741895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.741905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.741921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.741933 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.844605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.844905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.845023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.845109 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.845219 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.948679 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.948739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.948760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.948785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:07 crc kubenswrapper[4762]: I0217 17:48:07.948805 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:07Z","lastTransitionTime":"2026-02-17T17:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.008301 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:51:07.396512714 +0000 UTC Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.051302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.051348 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.051362 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.051382 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.051395 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.153669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.153722 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.153737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.153758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.153774 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.193358 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:08 crc kubenswrapper[4762]: E0217 17:48:08.193504 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:08 crc kubenswrapper[4762]: E0217 17:48:08.193565 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:09.193547096 +0000 UTC m=+40.838465106 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.256443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.256489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.256501 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.256517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.256532 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.359903 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.359957 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.359974 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.359990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.360002 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.462179 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.462208 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.462216 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.462229 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.462238 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.565152 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.565190 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.565199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.565228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.565241 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.667864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.667945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.667965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.667994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.668019 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.770990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.771110 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.771130 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.771158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.771176 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.873615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.873688 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.873701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.873719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.873730 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.975949 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.975986 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.975996 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.976011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:08 crc kubenswrapper[4762]: I0217 17:48:08.976022 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:08Z","lastTransitionTime":"2026-02-17T17:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.008475 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:17:28.853004196 +0000 UTC Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.035039 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:09 crc kubenswrapper[4762]: E0217 17:48:09.035168 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.035458 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:09 crc kubenswrapper[4762]: E0217 17:48:09.035525 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.035563 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:09 crc kubenswrapper[4762]: E0217 17:48:09.035615 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.035983 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:09 crc kubenswrapper[4762]: E0217 17:48:09.036119 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.057989 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.075328 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.078244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.078458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.078740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.078904 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.079059 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.090363 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.105562 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.122000 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a4
9e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.143970 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.156911 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.171115 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc 
kubenswrapper[4762]: I0217 17:48:09.182198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.182222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.182230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.182243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.182252 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.183373 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.194435 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.203228 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:09 crc kubenswrapper[4762]: E0217 17:48:09.203439 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:09 crc kubenswrapper[4762]: E0217 17:48:09.203689 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:11.203665816 +0000 UTC m=+42.848583826 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.212243 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.225060 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.237325 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.247880 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.264041 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.274687 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.284378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.284546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.284612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.284701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.284778 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.387634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.387907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.388006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.388101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.388165 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.490785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.490872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.490899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.490929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.490952 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.593588 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.593704 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.593732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.593761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.593781 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.696008 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.696042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.696051 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.696063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.696072 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.799084 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.799128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.799137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.799151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.799162 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.902526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.902573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.902583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.902603 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:09 crc kubenswrapper[4762]: I0217 17:48:09.902612 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:09Z","lastTransitionTime":"2026-02-17T17:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.005616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.005913 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.005947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.005978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.006001 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.009098 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 21:36:30.212067195 +0000 UTC Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.108751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.108792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.108801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.108816 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.108825 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.211175 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.211218 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.211234 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.211254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.211266 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.314196 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.314246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.314258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.314276 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.314289 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.417366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.417424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.417440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.417460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.417477 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.520347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.520433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.520469 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.520509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.520532 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.623236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.623284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.623296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.623311 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.623323 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.726599 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.726721 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.726746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.726769 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.726787 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.828898 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.828958 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.828978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.828998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.829012 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.931350 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.931395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.931406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.931425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:10 crc kubenswrapper[4762]: I0217 17:48:10.931439 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:10Z","lastTransitionTime":"2026-02-17T17:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.009863 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:49:01.515487857 +0000 UTC Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034229 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034866 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:11 crc kubenswrapper[4762]: E0217 17:48:11.034950 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.034966 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.035023 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.035072 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:11 crc kubenswrapper[4762]: E0217 17:48:11.035027 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:11 crc kubenswrapper[4762]: E0217 17:48:11.035225 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:11 crc kubenswrapper[4762]: E0217 17:48:11.035352 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.137182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.137235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.137248 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.137266 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.137278 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.226154 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:11 crc kubenswrapper[4762]: E0217 17:48:11.226382 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:11 crc kubenswrapper[4762]: E0217 17:48:11.226667 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:15.226603945 +0000 UTC m=+46.871521955 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.239387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.239644 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.239767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.239891 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.239971 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.341988 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.342045 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.342060 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.342081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.342102 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.445337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.445843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.445982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.446080 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.446188 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.548667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.548895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.548957 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.549057 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.549122 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.652011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.652055 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.652063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.652077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.652086 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.755370 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.755533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.755569 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.755602 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.755660 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.859022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.859071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.859085 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.859105 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.859119 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.962485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.962553 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.962576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.962607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:11 crc kubenswrapper[4762]: I0217 17:48:11.962662 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:11Z","lastTransitionTime":"2026-02-17T17:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.010286 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 18:04:42.427882309 +0000 UTC Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.065915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.065977 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.065990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.066007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.066018 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.168855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.169388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.169420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.169447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.169467 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.272892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.272968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.272987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.273018 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.273038 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.376369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.376423 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.376435 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.376452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.376462 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.478664 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.478709 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.478726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.478746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.478760 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.581128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.581182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.581195 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.581213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.581226 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.684070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.684107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.684115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.684127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.684136 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.786789 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.786838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.786853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.786874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.786889 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.889353 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.889425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.889434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.889448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.889457 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.992028 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.992073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.992090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.992104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:12 crc kubenswrapper[4762]: I0217 17:48:12.992113 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:12Z","lastTransitionTime":"2026-02-17T17:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.011303 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 07:39:44.129712223 +0000 UTC Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.035859 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:13 crc kubenswrapper[4762]: E0217 17:48:13.036091 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.036886 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.036969 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:13 crc kubenswrapper[4762]: E0217 17:48:13.037040 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:13 crc kubenswrapper[4762]: E0217 17:48:13.037138 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.037159 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:13 crc kubenswrapper[4762]: E0217 17:48:13.037310 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.095140 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.095179 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.095192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.095209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.095221 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.198028 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.198089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.198106 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.198129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.198148 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.300357 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.300426 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.300448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.300482 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.300518 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.402828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.402889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.402906 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.402930 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.402950 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.505538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.505621 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.505735 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.505761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.505778 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.609071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.609116 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.609125 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.609139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.609147 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.711734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.711786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.711803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.711823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.711835 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.814166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.814203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.814213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.814228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.814239 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.916524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.916562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.916572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.916585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:13 crc kubenswrapper[4762]: I0217 17:48:13.916595 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:13Z","lastTransitionTime":"2026-02-17T17:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.011836 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 19:59:25.502633494 +0000 UTC Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.019453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.019523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.019541 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.019567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.019584 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.122797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.122831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.122845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.122862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.122874 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.225962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.226071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.226090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.226114 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.226126 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.329071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.329119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.329136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.329155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.329168 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.687285 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.687532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.687642 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.687780 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.687923 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.790932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.791326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.791459 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.791617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.791856 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.894766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.894804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.894818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.894835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.894846 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.997308 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.997604 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.997771 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.997884 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:14 crc kubenswrapper[4762]: I0217 17:48:14.998065 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:14Z","lastTransitionTime":"2026-02-17T17:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.012088 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 09:40:50.863893925 +0000 UTC Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.033893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.034070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.034132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.034249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.034316 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.035182 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.035289 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.035562 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.035613 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.035701 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.035758 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.035798 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.035836 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.046460 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.050120 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.050154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.050166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.050182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.050192 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.060889 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.064090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.064132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.064144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.064160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.064170 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.076304 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.079432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.079470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.079478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.079493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.079502 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.091439 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.094565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.094606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.094625 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.094660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.094673 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.106004 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.106106 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.108457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.108502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.108512 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.108526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.108535 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.210775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.210811 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.210819 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.210834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.210843 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.292143 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.292240 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:15 crc kubenswrapper[4762]: E0217 17:48:15.292541 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:23.292456963 +0000 UTC m=+54.937374983 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.313039 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.313100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.313119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.313142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.313160 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.416534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.416574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.416587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.416607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.416655 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.518725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.518766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.518777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.518794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.518806 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.621326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.621494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.621523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.621597 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.621649 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.724863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.724977 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.724994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.725019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.725037 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.827980 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.828031 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.828041 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.828060 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.828071 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.930893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.931226 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.931361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.931489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:15 crc kubenswrapper[4762]: I0217 17:48:15.931620 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:15Z","lastTransitionTime":"2026-02-17T17:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.012917 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 13:53:09.528929874 +0000 UTC Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.034616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.034745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.034772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.034804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.034827 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.137740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.137777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.137818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.137836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.137846 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.241581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.241662 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.241681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.241703 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.241718 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.344788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.344998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.345106 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.345194 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.345272 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.448517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.448820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.448928 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.449066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.449167 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.552057 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.552130 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.552150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.552220 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.552240 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.655813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.655871 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.655888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.655911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.655928 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.759693 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.759764 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.759787 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.759817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.759840 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.863282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.863821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.863917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.864000 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.864080 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.967063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.967392 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.967509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.967589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:16 crc kubenswrapper[4762]: I0217 17:48:16.967694 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:16Z","lastTransitionTime":"2026-02-17T17:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.013473 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 07:06:06.227979297 +0000 UTC Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.035480 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.035517 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:17 crc kubenswrapper[4762]: E0217 17:48:17.035607 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:17 crc kubenswrapper[4762]: E0217 17:48:17.035769 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.035847 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.035900 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:17 crc kubenswrapper[4762]: E0217 17:48:17.035930 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:17 crc kubenswrapper[4762]: E0217 17:48:17.036043 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.070183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.070221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.070233 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.070253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.070269 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.173097 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.173139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.173152 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.173170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.173182 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.235439 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.245375 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.258823 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.274535 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.276462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.276485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.276494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.276507 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.276516 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.285970 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.298489 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc 
kubenswrapper[4762]: I0217 17:48:17.314208 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.329064 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.361862 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.375756 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.378366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.378415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.378429 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.378447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.378458 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.389336 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.400656 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.414517 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.423741 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.434960 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.448557 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.461646 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.473598 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.481163 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.481198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.481211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc 
kubenswrapper[4762]: I0217 17:48:17.481228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.481240 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.583722 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.584067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.584333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.584598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.584913 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.687865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.687920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.687939 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.687963 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.687980 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.791223 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.791270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.791282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.791301 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.791311 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.894360 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.894408 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.894420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.894440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.894457 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.997836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.997904 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.997927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.997955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:17 crc kubenswrapper[4762]: I0217 17:48:17.997975 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:17Z","lastTransitionTime":"2026-02-17T17:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.014112 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 03:03:42.047020518 +0000 UTC Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.101522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.101573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.101582 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.101603 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.101619 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.204862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.204925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.204942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.204971 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.204990 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.307583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.307979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.308133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.308284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.308422 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.410608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.410704 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.410715 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.410736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.410757 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.513369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.513435 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.513452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.513470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.513484 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.616163 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.616213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.616225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.616242 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.616252 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.719517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.719586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.719601 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.719644 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.719664 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.822306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.822352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.822362 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.822387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.822401 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.925817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.925871 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.925881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.925895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:18 crc kubenswrapper[4762]: I0217 17:48:18.925904 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:18Z","lastTransitionTime":"2026-02-17T17:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.014678 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:06:48.759733514 +0000 UTC Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.028123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.028159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.028167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.028182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.028194 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.035449 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:19 crc kubenswrapper[4762]: E0217 17:48:19.035702 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.036119 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:19 crc kubenswrapper[4762]: E0217 17:48:19.036230 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.036303 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:19 crc kubenswrapper[4762]: E0217 17:48:19.036432 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.036500 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:19 crc kubenswrapper[4762]: E0217 17:48:19.036574 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.057243 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.081279 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.101539 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.117153 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.130123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.130166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.130182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.130204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.130220 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.133491 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.149683 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.165050 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.176444 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.188831 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.203734 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.215358 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.227820 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.232489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 
17:48:19.232555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.232576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.232614 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.232671 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.238601 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.252388 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a4
9e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.261212 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.274033 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.283717 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:19 crc 
kubenswrapper[4762]: I0217 17:48:19.334515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.334548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.334559 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.334572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.334581 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.436720 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.436788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.436807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.436831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.436850 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.539320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.539405 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.539420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.539444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.539458 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.642491 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.642526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.642536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.642550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.642561 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.746050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.746086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.746096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.746113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.746125 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.849054 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.849124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.849139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.849160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.849176 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.952698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.952765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.952786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.952815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:19 crc kubenswrapper[4762]: I0217 17:48:19.952832 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:19Z","lastTransitionTime":"2026-02-17T17:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.015775 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 16:14:09.261203681 +0000 UTC Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.055413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.055471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.055486 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.055515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.055532 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.159118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.159187 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.159206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.159233 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.159253 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.262007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.262069 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.262082 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.262102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.262122 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.365193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.365251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.365266 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.365283 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.365295 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.467944 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.468019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.468041 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.468076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.468100 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.571558 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.571683 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.571708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.571726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.571736 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.674280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.674310 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.674318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.674333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.674345 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.777881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.777981 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.778003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.778033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.778051 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.850259 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850380 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:48:52.850355576 +0000 UTC m=+84.495273596 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.850420 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.850455 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.850499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.850528 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850565 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850580 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850590 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850606 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850652 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:52.850624284 +0000 UTC m=+84.495542294 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850654 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850691 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850711 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850668 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:52.850658665 +0000 UTC m=+84.495576695 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850786 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:52.850768588 +0000 UTC m=+84.495686638 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850817 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: E0217 17:48:20.850968 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:52.850936963 +0000 UTC m=+84.495855013 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.881066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.881155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.881168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.881181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.881189 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.984342 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.984387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.984401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.984423 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:20 crc kubenswrapper[4762]: I0217 17:48:20.984439 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:20Z","lastTransitionTime":"2026-02-17T17:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.016527 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 09:28:09.459613796 +0000 UTC Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.035217 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:21 crc kubenswrapper[4762]: E0217 17:48:21.035410 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.035510 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.035959 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.036014 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:21 crc kubenswrapper[4762]: E0217 17:48:21.036242 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:21 crc kubenswrapper[4762]: E0217 17:48:21.036294 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:21 crc kubenswrapper[4762]: E0217 17:48:21.036361 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.036797 4762 scope.go:117] "RemoveContainer" containerID="b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.086407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.086449 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.086460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.086477 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.086490 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.189197 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.190133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.190291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.190411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.190535 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.293371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.293458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.293484 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.293511 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.293531 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.359531 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/1.log" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.362062 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.362434 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.378831 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.396293 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc 
kubenswrapper[4762]: I0217 17:48:21.396355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.396371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.396392 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.396410 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.402392 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.416679 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.440118 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.456324 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.477582 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.498677 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.499019 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.499042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.499050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.499062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.499071 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.512398 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc 
kubenswrapper[4762]: I0217 17:48:21.524479 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.536471 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.554159 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, 
Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.566810 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.578406 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.586266 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.600307 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.601529 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.601567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.601579 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.601598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.601611 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.612188 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.629046 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.703572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.703654 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.703669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.703687 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.703700 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.806349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.806386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.806395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.806408 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.806417 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.908895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.908934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.908945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.908961 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:21 crc kubenswrapper[4762]: I0217 17:48:21.908974 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:21Z","lastTransitionTime":"2026-02-17T17:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.012043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.012128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.012151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.012184 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.012206 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.017178 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 06:20:18.299918863 +0000 UTC Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.115420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.115466 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.115481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.115502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.115519 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.218420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.218487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.218513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.218542 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.218565 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.320751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.320800 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.320811 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.320828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.320841 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.367041 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/2.log" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.367664 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/1.log" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.370930 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab" exitCode=1 Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.370970 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.371005 4762 scope.go:117] "RemoveContainer" containerID="b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.372227 4762 scope.go:117] "RemoveContainer" containerID="7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab" Feb 17 17:48:22 crc kubenswrapper[4762]: E0217 17:48:22.372536 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.387900 4762 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.398973 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.413596 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.423707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.423754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.423765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.423784 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.423795 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.424803 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.441521 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.457748 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.473746 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.484509 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.495756 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.507730 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.523851 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.525691 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.525725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.525734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.525748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.525757 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.539268 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.552852 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc 
kubenswrapper[4762]: I0217 17:48:22.573777 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.592211 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.617193 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e1845d859aca1df41eca363f0ef38de4e378a7182ae4c1cf3ceaefd7875a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 17:48:05.114180 6225 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0217 17:48:05.114150 6225 services_controller.go:451] Built service openshift-machine-api/cluster-autoscaler-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Tem\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for 
Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 
17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7
d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.628037 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.628112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.628138 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.628173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.628199 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.636542 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.732028 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.732111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.732132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.732159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.732177 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.834723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.834793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.834815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.834836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.834848 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.937546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.937604 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.937616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.937657 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:22 crc kubenswrapper[4762]: I0217 17:48:22.937670 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:22Z","lastTransitionTime":"2026-02-17T17:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.017349 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:32:44.789664007 +0000 UTC Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.035864 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.035905 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.035958 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.035888 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.036037 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.036129 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.036191 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.036388 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.040286 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.040321 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.040339 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.040367 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.040381 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.143104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.143343 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.143444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.143523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.143586 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.247013 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.247054 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.247062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.247076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.247085 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.350160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.350213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.350225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.350241 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.350253 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.374941 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.375064 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.375149 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:48:39.375132899 +0000 UTC m=+71.020050909 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.376437 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/2.log" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.381483 4762 scope.go:117] "RemoveContainer" containerID="7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab" Feb 17 17:48:23 crc kubenswrapper[4762]: E0217 17:48:23.381870 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.398315 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a4
9e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.411922 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.423759 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.434252 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc 
kubenswrapper[4762]: I0217 17:48:23.446427 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.452647 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.452668 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.452678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.452692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.452705 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.461949 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.486841 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.501838 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.520577 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.535226 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.552108 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.554866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.554899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.554907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.554920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.554930 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.564101 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.579876 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.592526 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.604734 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.616360 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.626146 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:23Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.657331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.657373 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.657381 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc 
kubenswrapper[4762]: I0217 17:48:23.657395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.657405 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.760225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.760269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.760280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.760297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.760308 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.862716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.862758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.862767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.862781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.862790 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.965160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.965209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.965222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.965244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:23 crc kubenswrapper[4762]: I0217 17:48:23.965256 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:23Z","lastTransitionTime":"2026-02-17T17:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.017782 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:40:10.678714135 +0000 UTC Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.068296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.068352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.068365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.068385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.068398 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.172180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.172240 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.172249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.172270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.172283 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.275575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.275651 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.275662 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.275681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.275690 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.378165 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.378227 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.378255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.378284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.378305 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.481517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.481570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.481586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.481605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.481620 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.584215 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.584292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.584317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.584347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.584369 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.687292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.687337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.687349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.687364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.687374 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.795158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.795201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.795213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.795227 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.795237 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.897708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.897773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.897796 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.897824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:24 crc kubenswrapper[4762]: I0217 17:48:24.897847 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:24Z","lastTransitionTime":"2026-02-17T17:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.000810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.000896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.000918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.001280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.001491 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.018322 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:36:05.016401681 +0000 UTC Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.035016 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.035069 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.035179 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.035026 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.035345 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.035437 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.035484 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.035736 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.104068 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.104117 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.104125 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.104140 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.104149 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.192345 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.192404 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.192416 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.192432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.192443 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.203762 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.207555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.207595 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.207607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.207644 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.207658 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.220031 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.224034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.224077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.224090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.224107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.224119 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.243173 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.247429 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.247503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.247521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.247545 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.247562 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.265256 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.270385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.270419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.270434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.270458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.270473 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.285028 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:25 crc kubenswrapper[4762]: E0217 17:48:25.285137 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.287017 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.287062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.287073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.287090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.287104 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.389048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.389102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.389113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.389131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.389143 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.492193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.492253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.492265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.492281 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.492293 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.594842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.594895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.594911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.594931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.594941 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.696709 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.696751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.696765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.696782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.696796 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.799659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.799696 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.799714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.799732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.799743 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.902191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.902261 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.902284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.902312 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:25 crc kubenswrapper[4762]: I0217 17:48:25.902331 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:25Z","lastTransitionTime":"2026-02-17T17:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.005853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.005910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.005928 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.005958 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.005978 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.019356 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 16:11:49.377150232 +0000 UTC Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.108267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.108307 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.108321 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.108337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.108349 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.211888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.211963 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.211989 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.212021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.212073 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.315204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.315270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.315292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.315319 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.315344 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.418588 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.418669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.418686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.418709 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.418726 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.521606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.521962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.522093 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.522219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.522332 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.625311 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.625353 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.625363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.625381 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.625390 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.728481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.728556 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.728590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.728684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.728709 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.832610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.832737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.832759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.832786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.832807 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.935948 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.936025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.936049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.936079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:26 crc kubenswrapper[4762]: I0217 17:48:26.936103 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:26Z","lastTransitionTime":"2026-02-17T17:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.020107 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:36:47.411038371 +0000 UTC Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.035578 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.035590 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.035702 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.035733 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:27 crc kubenswrapper[4762]: E0217 17:48:27.035914 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:27 crc kubenswrapper[4762]: E0217 17:48:27.036073 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:27 crc kubenswrapper[4762]: E0217 17:48:27.036235 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:27 crc kubenswrapper[4762]: E0217 17:48:27.036474 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.038647 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.038714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.038733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.038750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.038761 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.141886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.141941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.141954 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.141972 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.141995 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.244870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.244919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.244933 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.244952 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.244964 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.348327 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.348379 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.348395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.348419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.348435 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.451922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.452016 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.452033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.452056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.452072 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.555762 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.555823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.555835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.555855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.555870 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.659324 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.659372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.659384 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.659401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.659411 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.762167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.762205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.762214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.762229 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.762238 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.865196 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.865269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.865294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.865328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.865367 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.967834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.967893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.967910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.967928 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:27 crc kubenswrapper[4762]: I0217 17:48:27.967947 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:27Z","lastTransitionTime":"2026-02-17T17:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.020443 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 19:16:45.526710059 +0000 UTC Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.069948 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.069979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.069988 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.070002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.070011 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.172814 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.172858 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.172870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.172886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.172898 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.275086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.275120 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.275128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.275141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.275151 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.378264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.378312 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.378320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.378334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.378343 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.481766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.481805 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.481813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.481832 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.481842 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.585580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.585648 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.585660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.585675 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.585686 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.688137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.688192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.688209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.688231 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.688249 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.791074 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.791108 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.791118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.791133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.791142 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.893872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.893914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.893926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.893943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.893956 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.997093 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.997512 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.997702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.997845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:28 crc kubenswrapper[4762]: I0217 17:48:28.997978 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:28Z","lastTransitionTime":"2026-02-17T17:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.021358 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 02:42:50.753916962 +0000 UTC Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.034951 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:29 crc kubenswrapper[4762]: E0217 17:48:29.035135 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.035405 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:29 crc kubenswrapper[4762]: E0217 17:48:29.035501 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.035739 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:29 crc kubenswrapper[4762]: E0217 17:48:29.036048 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.035853 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:29 crc kubenswrapper[4762]: E0217 17:48:29.036437 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.056233 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.073991 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.088084 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.101204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.101241 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.101250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc 
kubenswrapper[4762]: I0217 17:48:29.101264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.101274 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.102087 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.116295 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.130566 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071
831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.140795 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc 
kubenswrapper[4762]: I0217 17:48:29.151617 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.161973 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.175110 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.193219 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.204521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc 
kubenswrapper[4762]: I0217 17:48:29.204563 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.204574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.204590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.204600 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.213294 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.228876 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93
fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.239027 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.254861 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.266646 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.275545 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.307798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.307835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.307845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.307859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.307869 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.409846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.409889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.409899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.409915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.409928 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.511591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.511653 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.511665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.511680 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.511691 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.615002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.615070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.615084 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.615101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.615114 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.718369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.718919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.718939 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.718963 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.718981 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.821121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.821187 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.821199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.821221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.821234 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.924352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.924407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.924421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.924441 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:29 crc kubenswrapper[4762]: I0217 17:48:29.924455 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:29Z","lastTransitionTime":"2026-02-17T17:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.022007 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 06:27:28.047506578 +0000 UTC Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.027678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.027730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.027742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.027759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.027771 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.131867 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.131926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.131943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.131967 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.131985 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.235702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.235756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.235772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.235792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.235808 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.339196 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.339275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.339297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.339325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.339351 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.442066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.442116 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.442146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.442162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.442172 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.544580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.544631 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.544641 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.544653 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.544663 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.646875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.646938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.646954 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.646973 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.646987 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.750501 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.750534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.750542 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.750557 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.750566 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.854379 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.854446 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.854470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.854533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.854560 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.958446 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.958504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.958522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.958546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:30 crc kubenswrapper[4762]: I0217 17:48:30.958567 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:30Z","lastTransitionTime":"2026-02-17T17:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.022832 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:57:03.797627732 +0000 UTC Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.035236 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.035274 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.035345 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.035236 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:31 crc kubenswrapper[4762]: E0217 17:48:31.035446 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:31 crc kubenswrapper[4762]: E0217 17:48:31.035636 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:31 crc kubenswrapper[4762]: E0217 17:48:31.035788 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:31 crc kubenswrapper[4762]: E0217 17:48:31.035954 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.061778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.061840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.061861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.061887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.061904 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.164271 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.164300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.164330 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.164343 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.164353 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.267575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.267669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.267682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.267719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.267732 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.370542 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.370578 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.370589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.370606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.370617 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.473036 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.473074 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.473098 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.473119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.473134 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.576017 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.576088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.576112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.576141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.576159 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.678174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.678203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.678211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.678224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.678233 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.789324 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.789369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.789381 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.789404 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.789415 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.892155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.892183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.892191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.892204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.892214 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.995807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.995856 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.995867 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.996144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:31 crc kubenswrapper[4762]: I0217 17:48:31.996178 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:31Z","lastTransitionTime":"2026-02-17T17:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.023293 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 00:21:48.470378428 +0000 UTC Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.099250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.099317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.099334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.099361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.099379 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.202526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.202596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.202616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.202683 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.202702 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.306358 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.306457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.306489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.306538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.306576 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.410941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.410984 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.410994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.411009 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.411020 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.514023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.514088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.514112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.514133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.514148 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.617264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.617307 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.617323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.617342 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.617356 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.719749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.719781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.719789 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.719802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.719811 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.822129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.822177 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.822189 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.822205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.822218 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.925079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.925124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.925135 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.925153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:32 crc kubenswrapper[4762]: I0217 17:48:32.925185 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:32Z","lastTransitionTime":"2026-02-17T17:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.023849 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 11:39:23.618050647 +0000 UTC Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.028045 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.028079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.028090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.028107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.028118 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.035664 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.035675 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.035741 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:33 crc kubenswrapper[4762]: E0217 17:48:33.035778 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.035791 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:33 crc kubenswrapper[4762]: E0217 17:48:33.035844 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:33 crc kubenswrapper[4762]: E0217 17:48:33.035873 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:33 crc kubenswrapper[4762]: E0217 17:48:33.036075 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.131048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.131118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.131143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.131173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.131194 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.234192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.234240 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.234265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.234277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.234284 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.337068 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.337115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.337126 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.337140 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.337150 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.438823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.438858 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.438867 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.438880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.438889 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.541102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.541169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.541180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.541202 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.541213 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.644021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.644101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.644119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.644154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.644179 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.746211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.746286 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.746297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.746323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.746342 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.849453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.849509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.849518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.849536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.849546 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.951927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.951983 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.951995 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.952014 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:33 crc kubenswrapper[4762]: I0217 17:48:33.952026 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:33Z","lastTransitionTime":"2026-02-17T17:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.024992 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:51:57.865557849 +0000 UTC Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.036755 4762 scope.go:117] "RemoveContainer" containerID="7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab" Feb 17 17:48:34 crc kubenswrapper[4762]: E0217 17:48:34.037245 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.054130 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.054184 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.054193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.054206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.054215 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.157104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.157161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.157171 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.157194 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.157207 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.259556 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.259598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.259607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.259666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.259678 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.361927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.361970 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.361980 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.361996 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.362006 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.464734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.464772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.464780 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.464794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.464802 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.566943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.566986 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.566994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.567008 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.567017 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.668921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.668965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.668972 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.668987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.668995 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.771600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.771657 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.771666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.771682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.771693 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.874473 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.874518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.874526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.874540 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.874552 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.977420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.977466 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.977478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.977497 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:34 crc kubenswrapper[4762]: I0217 17:48:34.977508 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:34Z","lastTransitionTime":"2026-02-17T17:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.025679 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:02:33.641013001 +0000 UTC Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.035134 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.035205 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.035310 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.035726 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.035799 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.035839 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.035960 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.036117 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.079612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.079704 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.079729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.079760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.079783 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.182731 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.182777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.182788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.182806 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.182820 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.284957 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.285004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.285017 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.285032 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.285043 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.388592 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.388699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.388714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.388734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.388745 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.487137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.487201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.487214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.487230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.487242 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.505905 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.509573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.509608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.509637 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.509656 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.509672 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.523667 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.527872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.527922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.527935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.527953 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.527965 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.544863 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.548463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.548503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.548522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.548543 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.548559 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.564022 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.567836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.567883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.567893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.567905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.567914 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.584768 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:35 crc kubenswrapper[4762]: E0217 17:48:35.584903 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.586498 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.586523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.586531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.586544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.586554 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.689411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.689452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.689461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.689509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.689523 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.792993 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.793033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.793043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.793063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.793073 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.895421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.895496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.895516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.895536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.895548 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.998864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.998926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.998949 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.998979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:35 crc kubenswrapper[4762]: I0217 17:48:35.999002 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:35Z","lastTransitionTime":"2026-02-17T17:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.026386 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:13:40.552908343 +0000 UTC Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.102091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.102142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.102156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.102180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.102193 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.205151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.205189 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.205200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.205215 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.205225 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.307729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.308009 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.308102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.308178 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.308238 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.410993 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.411243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.411387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.411483 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.411563 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.515172 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.515227 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.515239 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.515264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.515278 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.618057 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.618147 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.618168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.618192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.618210 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.720807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.720870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.720887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.720915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.720941 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.823714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.823777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.823795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.823819 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.823836 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.926616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.926694 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.926707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.926726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:36 crc kubenswrapper[4762]: I0217 17:48:36.926742 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:36Z","lastTransitionTime":"2026-02-17T17:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.026791 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 15:39:46.26749379 +0000 UTC Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.029250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.029415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.029547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.029750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.029876 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.035604 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.035658 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.035615 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.035604 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:37 crc kubenswrapper[4762]: E0217 17:48:37.035728 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:37 crc kubenswrapper[4762]: E0217 17:48:37.035850 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:37 crc kubenswrapper[4762]: E0217 17:48:37.035884 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:37 crc kubenswrapper[4762]: E0217 17:48:37.035928 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.132262 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.132317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.132333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.132354 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.132370 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.234493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.234549 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.234562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.234579 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.234591 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.337480 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.337824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.337963 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.338115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.338256 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.440436 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.440487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.440502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.440522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.440535 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.543297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.543635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.543747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.543837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.543937 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.646471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.646516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.646526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.646541 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.646550 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.748723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.748757 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.748786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.748801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.748810 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.852193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.852226 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.852236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.852250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.852258 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.955000 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.955387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.955537 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.955744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:37 crc kubenswrapper[4762]: I0217 17:48:37.955905 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:37Z","lastTransitionTime":"2026-02-17T17:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.027723 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 12:20:12.036990108 +0000 UTC Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.060209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.060245 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.060255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.060271 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.060282 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.162650 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.162689 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.162698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.162728 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.162738 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.265349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.265453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.265513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.265542 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.265600 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.367525 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.367567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.367578 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.367598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.367608 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.470147 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.470189 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.470197 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.470215 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.470226 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.572444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.572482 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.572493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.572510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.572524 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.674468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.674509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.674518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.674536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.674545 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.776403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.776448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.776463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.776480 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.776490 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.878830 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.878869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.878881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.878901 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.878914 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.981801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.981848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.981860 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.981875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:38 crc kubenswrapper[4762]: I0217 17:48:38.981886 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:38Z","lastTransitionTime":"2026-02-17T17:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.028417 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 06:44:08.067862808 +0000 UTC Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.035956 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:39 crc kubenswrapper[4762]: E0217 17:48:39.036043 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.036157 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:39 crc kubenswrapper[4762]: E0217 17:48:39.036232 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.036270 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:39 crc kubenswrapper[4762]: E0217 17:48:39.036328 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.036460 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:39 crc kubenswrapper[4762]: E0217 17:48:39.036566 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.049596 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.058214 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc 
kubenswrapper[4762]: I0217 17:48:39.070086 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.079710 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.084127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.084251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.084359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.084448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.084522 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.091316 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.101118 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.123140 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.140439 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93
fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.151130 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.162400 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.174040 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.183913 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.186356 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.186410 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.186424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.186443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.186454 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.197195 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.211942 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.223782 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.233776 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.246606 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.289182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.289221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.289232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.289247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.289258 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.391846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.391909 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.391921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.391938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.391949 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.448489 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:39 crc kubenswrapper[4762]: E0217 17:48:39.448610 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:39 crc kubenswrapper[4762]: E0217 17:48:39.448704 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:49:11.448688261 +0000 UTC m=+103.093606271 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.493524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.493557 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.493565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.493577 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.493585 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.595433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.595477 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.595487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.595504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.595513 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.697494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.697553 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.697563 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.697581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.697594 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.799593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.799642 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.799651 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.799719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.799732 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.901352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.901425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.901438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.901458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:39 crc kubenswrapper[4762]: I0217 17:48:39.901471 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:39Z","lastTransitionTime":"2026-02-17T17:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.005534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.005583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.005596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.005612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.005642 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.029042 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 12:45:48.309749754 +0000 UTC Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.048258 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.108015 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.108043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.108054 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.108070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.108080 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.210076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.210122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.210134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.210152 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.210166 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.312313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.312595 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.312678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.312754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.313048 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.415171 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.415209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.415221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.415244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.415256 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.517187 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.517225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.517237 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.517253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.517265 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.619781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.619819 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.619829 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.619844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.619855 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.722889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.722923 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.722934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.722948 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.722958 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.825043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.825393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.825569 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.825749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.826083 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.929111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.929143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.929151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.929164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:40 crc kubenswrapper[4762]: I0217 17:48:40.929173 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:40Z","lastTransitionTime":"2026-02-17T17:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.030299 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 12:01:47.962959137 +0000 UTC Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.031485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.031558 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.031571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.031586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.031598 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.034739 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.034739 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:41 crc kubenswrapper[4762]: E0217 17:48:41.034833 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:41 crc kubenswrapper[4762]: E0217 17:48:41.034914 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.035455 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:41 crc kubenswrapper[4762]: E0217 17:48:41.035517 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.035464 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:41 crc kubenswrapper[4762]: E0217 17:48:41.035979 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.134012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.134048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.134056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.134069 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.134077 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.236847 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.236917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.236930 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.236950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.236962 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.338706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.338755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.338767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.338783 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.338795 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.440544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.440579 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.440587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.440601 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.440610 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.441359 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/0.log" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.441401 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0f706d4-18a1-44c0-8913-b46af7876ee7" containerID="fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9" exitCode=1 Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.441429 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerDied","Data":"fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.441821 4762 scope.go:117] "RemoveContainer" containerID="fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.456804 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.478186 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.495739 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.510948 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93
fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.522508 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.538392 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.542433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.542464 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.542531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.542749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.542784 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.556287 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.568806 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.584682 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.596666 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.606719 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.618078 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.629983 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.641227 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071
831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.644822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.644884 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.644898 4762 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.644918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.644938 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.654421 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 
17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.672699 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.681476 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.690573 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:41Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.747756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.747804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.747818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.747837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.747849 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.850363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.850422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.850434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.850450 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.850461 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.953351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.953410 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.953421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.953437 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:41 crc kubenswrapper[4762]: I0217 17:48:41.953453 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:41Z","lastTransitionTime":"2026-02-17T17:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.030752 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 18:35:16.353510285 +0000 UTC Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.056552 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.056607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.056645 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.056664 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.056676 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.159768 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.159886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.159911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.159940 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.159963 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.262118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.262212 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.262230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.262250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.262266 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.365214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.365247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.365258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.365272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.365282 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.448286 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/0.log" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.448349 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerStarted","Data":"88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.463613 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.467246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.467287 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.467302 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.467325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.467344 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.485195 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.498339 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.510843 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.520069 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.531378 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.540061 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.552925 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.565227 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.569108 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.569154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.569169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.569184 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.569195 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.577329 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.593378 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.606070 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.622901 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.633663 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.645017 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.656047 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.665648 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc 
kubenswrapper[4762]: I0217 17:48:42.671100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.671142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.671153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.671169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.671179 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.678089 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.773833 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.773874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.773886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.773903 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.774182 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.877155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.877226 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.877238 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.877255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.877269 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.979873 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.979938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.979961 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.979989 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:42 crc kubenswrapper[4762]: I0217 17:48:42.980010 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:42Z","lastTransitionTime":"2026-02-17T17:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.031312 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:04:20.704509194 +0000 UTC Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.035783 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:43 crc kubenswrapper[4762]: E0217 17:48:43.035976 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.036104 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:43 crc kubenswrapper[4762]: E0217 17:48:43.036239 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.036759 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.036759 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:43 crc kubenswrapper[4762]: E0217 17:48:43.036882 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:43 crc kubenswrapper[4762]: E0217 17:48:43.036967 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.082902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.082972 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.082997 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.083025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.083042 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.185468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.185535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.185560 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.185589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.185613 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.288510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.288565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.288583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.288605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.288655 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.391685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.391754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.391817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.391858 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.391881 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.495476 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.495571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.495579 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.495591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.495599 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.598510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.598598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.598616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.598915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.598935 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.702320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.702372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.702388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.702413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.702452 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.805790 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.805852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.805869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.805894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.805911 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.908243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.908302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.908318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.908338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:43 crc kubenswrapper[4762]: I0217 17:48:43.908352 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:43Z","lastTransitionTime":"2026-02-17T17:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.011006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.011095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.011112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.011130 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.011143 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.031710 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 23:39:19.387235907 +0000 UTC Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.114696 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.114819 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.114839 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.114862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.114878 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.218250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.218303 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.218315 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.218332 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.218346 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.320746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.320796 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.320810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.320835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.320851 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.423191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.423227 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.423238 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.423254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.423265 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.526204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.526244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.526252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.526264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.526272 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.628503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.628549 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.628560 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.628576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.628589 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.732109 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.732164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.732182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.732203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.732218 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.835863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.835913 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.835925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.835943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.835955 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.938988 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.939047 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.939060 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.939081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:44 crc kubenswrapper[4762]: I0217 17:48:44.939095 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:44Z","lastTransitionTime":"2026-02-17T17:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.031934 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 21:35:08.525691309 +0000 UTC Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.035619 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.035713 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.035768 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.035763 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.035616 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.035873 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.035942 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.036161 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.041048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.041076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.041087 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.041121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.041132 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.144379 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.144417 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.144424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.144438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.144448 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.247014 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.247077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.247087 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.247106 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.247122 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.349658 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.349722 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.349734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.349751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.349763 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.452804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.452863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.452875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.452893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.452907 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.556406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.556470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.556488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.556516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.556536 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.655138 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.655195 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.655208 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.655230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.655247 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.670737 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.676114 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.676170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.676185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.676210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.676225 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.688528 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.693106 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.693141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.693153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.693175 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.693187 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.713150 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.716785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.716845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.716860 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.716880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.716892 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.730530 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.735105 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.735163 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.735224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.735255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.735274 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.749962 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:45 crc kubenswrapper[4762]: E0217 17:48:45.750105 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.751661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.751695 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.751706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.751723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.751737 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.854113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.854148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.854156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.854170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.854180 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.956751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.956792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.956803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.956820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:45 crc kubenswrapper[4762]: I0217 17:48:45.956833 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:45Z","lastTransitionTime":"2026-02-17T17:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.032942 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:53:50.939712209 +0000 UTC Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.036221 4762 scope.go:117] "RemoveContainer" containerID="7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.060326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.060372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.060385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.060403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.060413 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.163778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.163857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.163883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.163933 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.163959 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.266216 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.266253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.266262 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.266275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.266284 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.368671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.368716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.368733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.368749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.368759 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.461905 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/2.log" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.464139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.464584 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.470623 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.470682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.470694 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.470711 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.470723 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.485781 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.496254 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pro
xy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.507734 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc 
kubenswrapper[4762]: I0217 17:48:46.520821 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.529782 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.548853 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 
17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\
\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.560254 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.572961 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.573222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.573246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.573256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.573273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.573284 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.583595 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.599289 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.611875 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.629811 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.646474 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.658133 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.669360 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.675956 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 
17:48:46.676004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.676019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.676037 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.676048 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.679610 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.691424 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.701467 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.778786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.778824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.778835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.778849 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.778859 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.882063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.882092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.882101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.882115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.882126 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.988208 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.988258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.988271 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.988287 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:46 crc kubenswrapper[4762]: I0217 17:48:46.988302 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:46Z","lastTransitionTime":"2026-02-17T17:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.033391 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 04:21:19.393377769 +0000 UTC Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.035980 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:47 crc kubenswrapper[4762]: E0217 17:48:47.036151 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.036236 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.036381 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:47 crc kubenswrapper[4762]: E0217 17:48:47.036452 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.036506 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:47 crc kubenswrapper[4762]: E0217 17:48:47.036570 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:47 crc kubenswrapper[4762]: E0217 17:48:47.036373 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.091113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.091162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.091173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.091190 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.091200 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.194531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.195067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.195261 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.195442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.195626 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.298552 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.298922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.299038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.299133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.299222 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.402124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.402440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.402530 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.402673 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.402798 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.471017 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/3.log" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.471973 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/2.log" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.476168 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" exitCode=1 Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.476228 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.476267 4762 scope.go:117] "RemoveContainer" containerID="7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.477414 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:48:47 crc kubenswrapper[4762]: E0217 17:48:47.477785 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.496497 4762 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8
a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.505494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.505566 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.505590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.505625 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.505683 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.513353 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubele
t\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.526890 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.539875 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.552397 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc 
kubenswrapper[4762]: I0217 17:48:47.566143 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.583171 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.606997 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd6cabdbaa4d4bcd268314019a96c627398f36aa00bd13f997ed21feaf96bab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:21Z\\\",\\\"message\\\":\\\".go:365] Adding new object: *v1.Pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:48:21.852095 6442 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-f6zrt after 0 failed attempt(s)\\\\nI0217 17:48:21.852099 6442 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 17:48:21.852029 6442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0217 17:48:21.852105 6442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:21Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:21.852\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:47Z\\\",\\\"message\\\":\\\"wrpm in node crc\\\\nI0217 17:48:46.845521 6839 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0217 17:48:46.845526 6839 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-scheduler/openshift-kube-scheduler-crc in node crc\\\\nI0217 17:48:46.845524 6839 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-wdzt7\\\\nF0217 17:48:46.845484 6839 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:46.845535 6839 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-machine-config-operator/ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b71
6fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.608472 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.608506 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.608515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.608532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.608542 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.623227 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.636103 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.645902 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.662483 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.674197 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.687489 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.700160 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.710370 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.710404 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.710412 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.710425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.710435 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.713711 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.726382 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.750055 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:47Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.812706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.812749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.812760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc 
kubenswrapper[4762]: I0217 17:48:47.812774 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.812788 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.915129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.915547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.915736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.915905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:47 crc kubenswrapper[4762]: I0217 17:48:47.916089 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:47Z","lastTransitionTime":"2026-02-17T17:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.018864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.018897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.018905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.018934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.018942 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.033719 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 04:41:30.473779232 +0000 UTC Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.122612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.122712 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.122724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.122742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.122754 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.225295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.225335 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.225343 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.225357 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.225368 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.327153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.327196 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.327206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.327222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.327235 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.430702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.430736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.430746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.430759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.430786 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.481433 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/3.log" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.488972 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:48:48 crc kubenswrapper[4762]: E0217 17:48:48.489463 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.505682 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.521455 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.533136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 
17:48:48.533176 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.533185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.533198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.533208 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.533387 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.547756 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.557891 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.570994 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.581343 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.590272 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc 
kubenswrapper[4762]: I0217 17:48:48.602289 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.610743 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.625809 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:47Z\\\",\\\"message\\\":\\\"wrpm in node crc\\\\nI0217 17:48:46.845521 6839 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0217 17:48:46.845526 6839 ovn.go:134] Ensuring zone local for Pod openshift-kube-scheduler/openshift-kube-scheduler-crc in node crc\\\\nI0217 17:48:46.845524 6839 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-multus/network-metrics-daemon-wdzt7\\\\nF0217 17:48:46.845484 6839 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:46.845535 6839 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.635761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.635789 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.635800 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.635816 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.635826 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.636563 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.650985 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.661113 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1a
c30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.673686 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93
fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.682703 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.692356 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.704332 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:48Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.737864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.737896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.737929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.737943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.737954 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.841220 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.841260 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.841289 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.841302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.841312 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.944787 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.945158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.945243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.945278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:48 crc kubenswrapper[4762]: I0217 17:48:48.945305 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:48Z","lastTransitionTime":"2026-02-17T17:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.034330 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 12:05:32.830500829 +0000 UTC Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.035721 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.035851 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:49 crc kubenswrapper[4762]: E0217 17:48:49.036069 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.036109 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.036125 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:49 crc kubenswrapper[4762]: E0217 17:48:49.036299 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:49 crc kubenswrapper[4762]: E0217 17:48:49.036465 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:49 crc kubenswrapper[4762]: E0217 17:48:49.036607 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.048188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.048254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.048273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.048297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.048314 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.053270 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.071870 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.084712 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.096639 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.108481 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc 
kubenswrapper[4762]: I0217 17:48:49.121091 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.139609 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.150381 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.150413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.150421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.150433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.150452 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.167175 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:47Z\\\",\\\"message\\\":\\\"wrpm in node crc\\\\nI0217 17:48:46.845521 6839 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0217 17:48:46.845526 6839 ovn.go:134] Ensuring zone local for Pod openshift-kube-scheduler/openshift-kube-scheduler-crc in node crc\\\\nI0217 17:48:46.845524 6839 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-multus/network-metrics-daemon-wdzt7\\\\nF0217 17:48:46.845484 6839 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:46.845535 6839 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.181599 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.193709 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.205982 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.222625 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.231319 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.251942 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.254035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.254059 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.254066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.254079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.254089 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.266892 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.281779 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.296349 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.308775 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.358025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.358064 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.358073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc 
kubenswrapper[4762]: I0217 17:48:49.358089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.358102 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.460560 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.460683 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.460714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.460749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.460772 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.564369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.564401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.564411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.564425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.564435 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.667221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.667275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.667294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.667315 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.667330 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.769676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.769726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.769737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.769754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.769765 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.873514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.873573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.873586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.873609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.873644 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.976937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.976985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.977002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.977024 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:49 crc kubenswrapper[4762]: I0217 17:48:49.977044 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:49Z","lastTransitionTime":"2026-02-17T17:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.034514 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 02:13:42.845232393 +0000 UTC Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.080210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.080280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.080307 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.080337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.080356 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.182882 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.182929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.182942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.182964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.182978 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.285942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.286002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.286022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.286046 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.286064 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.388646 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.388738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.388750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.388770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.388783 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.490591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.490636 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.490681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.490699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.490712 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.592722 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.592842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.592874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.592890 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.592901 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.697008 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.697085 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.697104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.697133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.697153 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.799828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.799887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.799899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.799920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.799932 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.903274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.903347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.903366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.903390 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:50 crc kubenswrapper[4762]: I0217 17:48:50.903407 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:50Z","lastTransitionTime":"2026-02-17T17:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.007843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.007902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.007919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.007941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.007956 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.035693 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:51 crc kubenswrapper[4762]: E0217 17:48:51.035945 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.035980 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.036077 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:33:29.063133917 +0000 UTC Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.036107 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:51 crc kubenswrapper[4762]: E0217 17:48:51.036177 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.036231 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:51 crc kubenswrapper[4762]: E0217 17:48:51.036368 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:51 crc kubenswrapper[4762]: E0217 17:48:51.036517 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.109657 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.109698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.109708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.109724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.109737 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.212440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.212498 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.212514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.212538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.212557 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.315309 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.315365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.315382 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.315403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.315417 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.418369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.418432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.418449 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.418475 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.418496 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.521415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.521714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.521816 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.521891 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.521971 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.624611 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.624726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.624745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.624769 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.624787 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.728282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.728718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.728871 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.729012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.729136 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.832345 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.832428 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.832444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.832465 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.832479 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.935484 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.935540 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.935589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.935617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:51 crc kubenswrapper[4762]: I0217 17:48:51.935659 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:51Z","lastTransitionTime":"2026-02-17T17:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.036784 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 10:53:34.437641641 +0000 UTC Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.038945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.038992 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.039007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.039025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.039037 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.141513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.141564 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.141574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.141588 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.141600 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.243824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.243868 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.243880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.243898 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.243912 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.346742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.347203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.347285 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.347369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.347445 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.449762 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.449808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.449817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.449838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.449848 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.552589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.552654 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.552663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.552677 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.552685 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.654692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.654745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.654762 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.654779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.654789 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.757761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.757836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.757850 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.757872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.757888 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.861306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.861391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.861407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.861423 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.861437 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.889684 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.889819 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.889843 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.889895 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.889849595 +0000 UTC m=+148.534767645 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.889960 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.889977 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.889989 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890036 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.89002181 +0000 UTC m=+148.534939820 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.890034 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.890093 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890139 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890207 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890229 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890278 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890170 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890327 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.890292828 +0000 UTC m=+148.535210838 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890365 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.89034864 +0000 UTC m=+148.535266860 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: E0217 17:48:52.890382 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.890374701 +0000 UTC m=+148.535292711 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.964091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.964136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.964148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.964167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:52 crc kubenswrapper[4762]: I0217 17:48:52.964179 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:52Z","lastTransitionTime":"2026-02-17T17:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.035784 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.035887 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.035808 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.035893 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:53 crc kubenswrapper[4762]: E0217 17:48:53.035995 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:53 crc kubenswrapper[4762]: E0217 17:48:53.036116 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:53 crc kubenswrapper[4762]: E0217 17:48:53.036222 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:53 crc kubenswrapper[4762]: E0217 17:48:53.036475 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.036953 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 03:27:48.284999106 +0000 UTC Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.066741 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.066784 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.066793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.066808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.066818 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.170139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.170216 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.170236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.170272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.170289 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.273180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.273225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.273235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.273256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.273268 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.376522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.376587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.376608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.376681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.376704 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.479346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.479431 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.479453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.479485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.479508 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.582779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.582871 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.582896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.582932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.582962 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.686730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.686775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.686790 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.686808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.686820 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.790164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.790234 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.790249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.790274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.790292 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.896178 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.896247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.896264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.896288 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.896308 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.999417 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.999465 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.999475 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.999489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:53 crc kubenswrapper[4762]: I0217 17:48:53.999498 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:53Z","lastTransitionTime":"2026-02-17T17:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.037235 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 13:35:34.453825887 +0000 UTC Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.102011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.102457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.102524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.102594 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.102695 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.206054 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.206126 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.206152 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.206184 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.206207 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.309393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.309605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.309614 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.309640 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.309654 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.412378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.412429 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.412441 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.412463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.412476 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.515107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.515161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.515185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.515210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.515225 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.617902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.617980 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.617998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.618025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.618048 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.720209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.720245 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.720254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.720269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.720279 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.823546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.823602 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.823614 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.823656 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.823672 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.926551 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.926607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.926654 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.926674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:54 crc kubenswrapper[4762]: I0217 17:48:54.926687 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:54Z","lastTransitionTime":"2026-02-17T17:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.028840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.028894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.028907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.028962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.028978 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.035214 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.035270 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.035221 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.035214 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.035351 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.035424 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.035479 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.035581 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.037815 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:45:31.432987103 +0000 UTC Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.131939 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.131990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.132002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.132020 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.132035 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.235340 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.235386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.235397 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.235415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.235426 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.338856 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.338909 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.338927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.338950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.338968 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.441487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.441518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.441526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.441539 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.441548 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.543808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.544147 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.544325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.544546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.544783 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.648107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.648170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.648186 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.648210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.648224 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.751101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.751170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.751184 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.751209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.751224 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.790159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.790257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.790275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.790304 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.790321 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.808614 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.812938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.813082 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.813157 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.813222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.813278 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.825907 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.830395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.830670 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.830772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.830840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.830910 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.844932 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.849046 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.849187 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.849410 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.849586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.849782 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.864521 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.869909 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.870099 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.870186 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.870254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.870317 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.887013 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:55Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:55 crc kubenswrapper[4762]: E0217 17:48:55.887197 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.889568 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.889624 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.889660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.889681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.889695 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.992418 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.992731 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.992825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.992900 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:55 crc kubenswrapper[4762]: I0217 17:48:55.992967 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:55Z","lastTransitionTime":"2026-02-17T17:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.038801 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 03:39:59.508332904 +0000 UTC Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.128918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.129044 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.129067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.129092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.129109 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.231926 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.231958 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.231966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.231981 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.231990 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.334107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.334174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.334188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.334202 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.334212 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.436766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.436837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.436849 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.436868 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.436881 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.539985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.540029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.540040 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.540055 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.540066 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.643029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.643099 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.643113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.643132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.643146 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.746870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.746947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.746965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.746991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.747019 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.850354 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.850399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.850409 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.850426 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.850436 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.952456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.952521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.952534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.952549 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:56 crc kubenswrapper[4762]: I0217 17:48:56.952558 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:56Z","lastTransitionTime":"2026-02-17T17:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.035598 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.035673 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.035733 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:57 crc kubenswrapper[4762]: E0217 17:48:57.035935 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.035990 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:57 crc kubenswrapper[4762]: E0217 17:48:57.036073 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:57 crc kubenswrapper[4762]: E0217 17:48:57.036159 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:57 crc kubenswrapper[4762]: E0217 17:48:57.036316 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.038983 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:57:35.45155518 +0000 UTC Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.055008 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.055240 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.055303 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.055363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.055419 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.157478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.157538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.157555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.157577 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.157593 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.260705 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.260778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.260788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.260803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.260812 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.363659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.363702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.363714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.363746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.363756 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.466550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.467061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.467270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.467496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.467727 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.570282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.570355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.570372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.570397 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.570414 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.673182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.673291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.673306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.673327 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.673339 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.776831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.776925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.776958 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.776992 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.777016 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.879545 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.879585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.879598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.879613 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.879641 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.981843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.981899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.981915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.981937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:57 crc kubenswrapper[4762]: I0217 17:48:57.981953 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:57Z","lastTransitionTime":"2026-02-17T17:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.039154 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 21:02:31.051119264 +0000 UTC Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.085053 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.085133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.085159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.085192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.085220 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.188775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.188842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.188852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.188873 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.188885 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.292428 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.292474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.292484 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.292501 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.292511 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.395686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.395768 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.395802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.395832 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.395887 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.498832 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.498899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.498922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.498947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.498967 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.602263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.602684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.602822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.602951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.603071 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.706401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.706450 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.706464 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.706487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.706501 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.809565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.809613 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.809670 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.809701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.809723 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.913193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.913254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.913271 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.913296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:58 crc kubenswrapper[4762]: I0217 17:48:58.913311 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:58Z","lastTransitionTime":"2026-02-17T17:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.015350 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.015405 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.015421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.015444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.015461 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.035146 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:48:59 crc kubenswrapper[4762]: E0217 17:48:59.035324 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.035574 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:48:59 crc kubenswrapper[4762]: E0217 17:48:59.035746 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.035584 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:48:59 crc kubenswrapper[4762]: E0217 17:48:59.035992 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.036085 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:48:59 crc kubenswrapper[4762]: E0217 17:48:59.036170 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.039861 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 01:02:08.145637888 +0000 UTC Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.059502 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.077560 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k2xfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f706d4-18a1-44c0-8913-b46af7876ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:40Z\\\",\\\"message\\\":\\\"2026-02-17T17:47:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9\\\\n2026-02-17T17:47:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ecd590cf-4328-479b-a6fa-1859a144dab9 to /host/opt/cni/bin/\\\\n2026-02-17T17:47:55Z [verbose] multus-daemon started\\\\n2026-02-17T17:47:55Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:48:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82z4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k2xfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.106824 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e901c69-4b38-4f54-9811-83bd34c46a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:48:47Z\\\",\\\"message\\\":\\\"wrpm in node crc\\\\nI0217 17:48:46.845521 6839 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0217 17:48:46.845526 6839 ovn.go:134] Ensuring zone local for Pod openshift-kube-scheduler/openshift-kube-scheduler-crc in node crc\\\\nI0217 17:48:46.845524 6839 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-wdzt7\\\\nF0217 17:48:46.845484 6839 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:46Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:48:46.845535 6839 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:48:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a70a65e0875123440
a3907908e9b716fdb809d815bde4792231ff7d5398ce53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-f6zrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.118304 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.118372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.118389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.118415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.118434 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.123143 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fzb7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46ea121f-8e60-4e68-af96-9c972a27988b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ed96ab55257826aa99ed006e74501e397a4cac7c08429d10a29d40525068aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h9ng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fzb7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.146724 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efb95f60944a3e74789858a6d912df36e1b3863bbbb478636f2cf9666591e14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.158611 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.167722 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgv5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166682c4-697f-453c-b43a-e649aaeb0c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6ad4c3ac1cfb95def1e1ac30b5af566909e70fa714cb2cb7ba089a01dbaaa7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmmvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgv5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.180554 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kg68g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"132714a2-f72f-40f0-8156-33fa78780072\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://783630c4dad03f0e4c85a665804ed66660b4221c7d26eb747f8c4869621d505a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fe22918a3fce33dcc184b12a44e284bad7705c7dd2072d2aacb83ee2d217a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffd31b93d641179048adae94e4527fa35fa40bb3390b283580a7e888aff9e9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2850c0e2c33b6b55107a5178d5ca439a0dccd1a02edaf53fdeee745d83206e6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9be93fc11a4f2ab566afe95dc08368135d97623a51246fc236ebcbc1453a6e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a33216
7552496a0df789f0be039acae5641d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a2458bbb4d64f6e1ff1ef2a93a332167552496a0df789f0be039acae5641d40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ce365f7c0a7609de4cf41701b06a6e54491909d33afeea049edb4b446f817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T17:47:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5fmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kg68g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.190518 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.199454 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7389b1a3-5839-49b0-97e8-2adcbe0fd491\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32b9866a235306fef252b071ab8ec7f271b41f87060363b442ceec533872e75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb8
03ea6cae1c8a39fca371aeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5bmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jb9kz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.211178 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c17536-0131-444a-a138-5f69ddbe8aff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26512d433a04e5d987559c453aaf63dde1aca58f571402aa89be7d5b3f3517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11616936f55ba8e838afeddf04b054dc8cdb7b638aa34e0634d706d4e9ae7e60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db7fa33722a413ee7dd47e63ceff30ca6dca4f08bd73da4eadb714d2c383936\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.221049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.221086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.221098 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.221115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.221128 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.221757 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb579a01-d363-40bb-92c5-f7a65bfb0cbd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://643b59f0a7fd172671b214a33927f34836693955a097e746a7b1e37c37c8b284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://e7ab5102ed3a59d65594d7e2786c95471c80673d23be0a1200c9cbc4946529b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2610e70b1f5aa35731f2b1de095f4a8b7433b99f423a92dd665546e5332f25f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a41daff5c1cd273e4ce59dc1931eade8fd551da29d222845e4f68f882708f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.233228 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fffc6920c9c3134e04ba49263ddbddc08084d65a85dc9d28f3ea0f9f1b4a9753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19142c72d89821d35d4fc15fae1d317cfb962aa7b3ca4eb056d6a157193333f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.244112 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bb87d75-4230-44b9-8ee8-7aff6d051904\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtx9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdzt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc 
kubenswrapper[4762]: I0217 17:48:59.257487 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9302ca52-ca46-4bc4-8c30-c436af0f9588\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b137ac973716
892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:47:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 17:47:42.523093 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:47:42.525572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1382788743/tls.crt::/tmp/serving-cert-1382788743/tls.key\\\\\\\"\\\\nI0217 17:47:48.093235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:47:48.095901 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:47:48.095919 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:47:48.095943 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:47:48.095948 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:47:48.099758 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:47:48.099791 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:47:48.099802 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:47:48.099971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:47:48.099974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
17:47:48.099978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:47:48.100006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:47:48.103948 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.268073 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b61e8f-e028-4031-b317-f843531a7cdc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a916cd693cf1a461d11f5d121000d8ed41a4fe15b64cc8ce87e0c43eeb0ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21f782794365f18527b6f1a788d3abe132e7478c443178d11adaaf1be8da82c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:47:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.279339 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:47:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9cc93521fbfe8316bb7d87999177e5c0eed0699f999f621b834c059b910254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.291159 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7a60e8f-4096-4b6c-bd25-5b5fa939c4d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:48:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62d7c8fe32e9e418d4a644792404d5a3e958ec02e34851e1ac0db94f0c17478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1a3596f606491a36c9049fef69378e77071831ab4b760be26a305beda813f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:48:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lwrpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:48:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.323411 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.323452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.323462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.323478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.323489 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.426346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.426857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.426987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.427110 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.427239 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.528906 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.529361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.529374 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.529392 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.529403 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.631548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.631672 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.631700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.631733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.631759 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.735430 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.735492 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.735515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.735544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.735565 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.838615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.838743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.838762 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.838785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.838802 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.942135 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.942170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.942180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.942195 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:48:59 crc kubenswrapper[4762]: I0217 17:48:59.942204 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:48:59Z","lastTransitionTime":"2026-02-17T17:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.040118 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:13:03.392394897 +0000 UTC Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.045093 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.045139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.045155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.045174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.045188 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.148063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.148123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.148139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.148161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.148177 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.250911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.250966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.250979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.250997 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.251010 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.353705 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.353744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.353755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.353773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.353785 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.455987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.456075 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.456100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.456131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.456154 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.559913 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.560014 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.560033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.560057 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.560075 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.662528 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.662563 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.662570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.662584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.662593 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.764923 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.765009 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.765036 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.765067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.765089 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.868346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.868402 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.868420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.868444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.868463 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.970785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.970844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.970863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.970886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:00 crc kubenswrapper[4762]: I0217 17:49:00.970904 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:00Z","lastTransitionTime":"2026-02-17T17:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.035462 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.035587 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:01 crc kubenswrapper[4762]: E0217 17:49:01.035681 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.036131 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:01 crc kubenswrapper[4762]: E0217 17:49:01.036199 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.036198 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:01 crc kubenswrapper[4762]: E0217 17:49:01.036326 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:01 crc kubenswrapper[4762]: E0217 17:49:01.036460 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.036927 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:49:01 crc kubenswrapper[4762]: E0217 17:49:01.037252 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.041260 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 22:50:04.978088286 +0000 UTC Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.072900 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.073369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.073393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.073419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.073438 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.176278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.176318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.176328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.176348 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.176358 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.279170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.279204 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.279213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.279233 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.279251 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.393052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.393093 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.393105 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.393121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.393132 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.495809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.495842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.495851 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.495867 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.495878 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.598244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.598292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.598305 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.598324 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.598351 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.701568 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.701613 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.701642 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.701660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.701675 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.804389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.804739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.804821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.805013 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.805092 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.906912 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.907366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.907510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.907666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:01 crc kubenswrapper[4762]: I0217 17:49:01.907965 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:01Z","lastTransitionTime":"2026-02-17T17:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.011245 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.011289 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.011300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.011320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.011333 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.041964 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:05:12.47009482 +0000 UTC Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.114338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.114401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.114414 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.114429 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.114440 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.217205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.217247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.217259 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.217274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.217286 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.320759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.320810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.320825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.320842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.320852 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.423378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.423773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.423799 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.423817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.423828 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.527067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.527104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.527114 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.527129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.527139 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.629562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.629604 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.629616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.629665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.629682 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.732426 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.732490 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.732512 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.732539 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.732560 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.835081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.835125 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.835137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.835153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.835164 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.937605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.937671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.937685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.937699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:02 crc kubenswrapper[4762]: I0217 17:49:02.937709 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:02Z","lastTransitionTime":"2026-02-17T17:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.035585 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.035680 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.035704 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:03 crc kubenswrapper[4762]: E0217 17:49:03.035763 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.035787 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:03 crc kubenswrapper[4762]: E0217 17:49:03.035920 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:03 crc kubenswrapper[4762]: E0217 17:49:03.036041 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:03 crc kubenswrapper[4762]: E0217 17:49:03.036198 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.040911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.040954 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.040967 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.040983 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.040995 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.042231 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:21:16.158704857 +0000 UTC Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.143300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.143389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.143406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.143430 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.143475 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.246555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.246587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.246595 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.246608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.246642 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.349181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.349218 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.349229 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.349246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.349258 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.451834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.451901 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.451923 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.451951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.451974 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.555516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.555582 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.555606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.555719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.555779 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.660590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.660706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.660732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.660764 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.660786 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.763798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.763852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.763864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.763880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.763891 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.866230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.866277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.866288 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.866304 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.866316 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.969707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.969758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.969776 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.969798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:03 crc kubenswrapper[4762]: I0217 17:49:03.969816 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:03Z","lastTransitionTime":"2026-02-17T17:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.043490 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 02:36:36.650131556 +0000 UTC Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.073696 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.073776 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.073798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.073833 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.073859 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.176537 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.176604 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.176659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.176694 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.176718 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.279785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.279843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.279860 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.279885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.279904 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.382290 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.382325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.382338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.382352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.382361 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.484836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.484882 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.484893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.484915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.484926 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.588136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.588181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.588193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.588209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.588225 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.691091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.691476 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.691581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.691839 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.691947 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.794561 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.794596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.794605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.794618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.794647 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.897434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.897484 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.897496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.897513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:04 crc kubenswrapper[4762]: I0217 17:49:04.897526 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:04.999681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:04.999723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:04.999733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:04.999747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:04.999758 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:04Z","lastTransitionTime":"2026-02-17T17:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.034738 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.034848 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:05 crc kubenswrapper[4762]: E0217 17:49:05.034904 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.034925 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.034880 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:05 crc kubenswrapper[4762]: E0217 17:49:05.035020 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:05 crc kubenswrapper[4762]: E0217 17:49:05.035134 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:05 crc kubenswrapper[4762]: E0217 17:49:05.035204 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.043759 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:03:13.233125233 +0000 UTC Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.103055 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.103115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.103134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.103159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.103177 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.205941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.205987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.205998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.206014 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.206025 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.308409 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.308445 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.308456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.308469 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.308480 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.410896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.410930 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.410938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.410951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.410960 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.513883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.513917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.513925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.513940 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.513949 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.616783 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.617375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.617614 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.617882 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.618092 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.721124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.721169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.721179 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.721197 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.721212 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.824443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.824484 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.824495 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.824511 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.824522 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.927456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.927504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.927518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.927534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:05 crc kubenswrapper[4762]: I0217 17:49:05.927544 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:05Z","lastTransitionTime":"2026-02-17T17:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.030419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.030736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.030834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.030937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.031010 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.044676 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 01:32:21.337848704 +0000 UTC Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.133787 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.133835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.133848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.133866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.133878 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.181462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.181542 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.181556 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.181571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.181580 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: E0217 17:49:06.196324 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:49:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.200043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.200096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.200111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.200131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.200144 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: E0217 17:49:06.212219 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:49:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.215400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.215445 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.215457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.215472 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.215486 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: E0217 17:49:06.227522 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:49:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.231205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.231239 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.231248 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.231263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.231273 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: E0217 17:49:06.242872 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:49:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.245705 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.245777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.245788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.245804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.245815 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: E0217 17:49:06.256033 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:49:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"60e49c4c-5e4b-4bf6-9895-1e12c94f3d77\\\",\\\"systemUUID\\\":\\\"1dc8183f-0bbf-41f8-ae92-b64e8a8697b3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:49:06Z is after 2025-08-24T17:21:41Z" Feb 17 17:49:06 crc kubenswrapper[4762]: E0217 17:49:06.256149 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.257689 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.257719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.257732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.257748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.257759 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.360468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.360527 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.360541 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.360562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.360577 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.462967 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.463036 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.463048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.463085 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.463100 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.565916 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.565965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.565975 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.565991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.566003 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.668758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.668823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.668840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.668866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.668883 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.774013 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.774069 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.774091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.774114 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.774129 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.876678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.876734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.876747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.876770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.876782 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.980070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.980150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.980174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.980206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:06 crc kubenswrapper[4762]: I0217 17:49:06.980229 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:06Z","lastTransitionTime":"2026-02-17T17:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.035546 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.035676 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.035672 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:07 crc kubenswrapper[4762]: E0217 17:49:07.035800 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.036006 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:07 crc kubenswrapper[4762]: E0217 17:49:07.036070 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:07 crc kubenswrapper[4762]: E0217 17:49:07.036358 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:07 crc kubenswrapper[4762]: E0217 17:49:07.036481 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.045289 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:34:54.761531565 +0000 UTC Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.083234 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.083279 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.083288 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.083306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.083317 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.186232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.186295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.186312 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.186336 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.186353 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.288738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.288794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.288809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.288828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.288842 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.391924 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.392006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.392032 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.392058 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.392076 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.494513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.494545 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.494554 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.494567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.494577 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.597549 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.597590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.597599 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.597615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.597643 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.700291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.700347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.700363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.700383 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.700395 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.803101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.803148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.803162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.803182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.803196 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.906424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.906494 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.906518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.906550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:07 crc kubenswrapper[4762]: I0217 17:49:07.906571 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:07Z","lastTransitionTime":"2026-02-17T17:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.009391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.009468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.009485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.009509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.009526 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.046008 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 22:29:31.944176379 +0000 UTC Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.112393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.112517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.112542 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.112574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.112592 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.215219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.215279 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.215297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.215320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.215336 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.317795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.317844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.317855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.317868 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.317876 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.420797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.420845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.420857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.420876 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.420888 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.523012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.523051 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.523061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.523077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.523090 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.625198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.625259 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.625273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.625289 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.625300 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.727726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.727755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.727765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.727804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.727816 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.830659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.830754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.830829 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.830857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.830930 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.934001 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.934073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.934101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.934128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:08 crc kubenswrapper[4762]: I0217 17:49:08.934147 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:08Z","lastTransitionTime":"2026-02-17T17:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.034913 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.034968 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:09 crc kubenswrapper[4762]: E0217 17:49:09.035041 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:09 crc kubenswrapper[4762]: E0217 17:49:09.035171 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.035463 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.036313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.036355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.036370 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.036388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.036405 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.036887 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:09 crc kubenswrapper[4762]: E0217 17:49:09.039313 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:09 crc kubenswrapper[4762]: E0217 17:49:09.039387 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.046348 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 10:21:53.44624299 +0000 UTC Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.056510 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.086165 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.08614496 podStartE2EDuration="52.08614496s" podCreationTimestamp="2026-02-17 17:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.085187832 +0000 UTC m=+100.730105852" watchObservedRunningTime="2026-02-17 17:49:09.08614496 +0000 UTC m=+100.731062970" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.086412 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=81.086407277 podStartE2EDuration="1m21.086407277s" podCreationTimestamp="2026-02-17 17:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.069596458 +0000 UTC m=+100.714514478" watchObservedRunningTime="2026-02-17 17:49:09.086407277 +0000 UTC m=+100.731325287" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.139123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.139242 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.139257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.139287 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.139299 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.163189 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podStartSLOduration=76.163167761 podStartE2EDuration="1m16.163167761s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.135945289 +0000 UTC m=+100.780863309" watchObservedRunningTime="2026-02-17 17:49:09.163167761 +0000 UTC m=+100.808085791" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.176855 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.176828118 podStartE2EDuration="1m21.176828118s" podCreationTimestamp="2026-02-17 17:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.164311024 +0000 UTC m=+100.809229064" watchObservedRunningTime="2026-02-17 
17:49:09.176828118 +0000 UTC m=+100.821746148" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.188299 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.188281262 podStartE2EDuration="29.188281262s" podCreationTimestamp="2026-02-17 17:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.177390485 +0000 UTC m=+100.822308535" watchObservedRunningTime="2026-02-17 17:49:09.188281262 +0000 UTC m=+100.833199272" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.199818 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lwrpm" podStartSLOduration=76.199799587 podStartE2EDuration="1m16.199799587s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.199137447 +0000 UTC m=+100.844055467" watchObservedRunningTime="2026-02-17 17:49:09.199799587 +0000 UTC m=+100.844717607" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.241547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.241598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.241609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.241640 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.241653 4762 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.264579 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k2xfd" podStartSLOduration=76.264555841 podStartE2EDuration="1m16.264555841s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.236717401 +0000 UTC m=+100.881635411" watchObservedRunningTime="2026-02-17 17:49:09.264555841 +0000 UTC m=+100.909473851" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.309703 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zgv5j" podStartSLOduration=77.309678244 podStartE2EDuration="1m17.309678244s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.309379935 +0000 UTC m=+100.954297945" watchObservedRunningTime="2026-02-17 17:49:09.309678244 +0000 UTC m=+100.954596254" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.338057 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kg68g" podStartSLOduration=76.338035029 podStartE2EDuration="1m16.338035029s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 17:49:09.328349547 +0000 UTC m=+100.973267557" watchObservedRunningTime="2026-02-17 17:49:09.338035029 +0000 UTC m=+100.982953049" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.344235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.344444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.344578 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.344707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.344826 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.446415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.446477 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.446490 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.446510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.446523 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.549397 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.549470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.549490 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.549514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.549556 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.651840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.651876 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.651886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.651899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.651909 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.754178 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.754251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.754269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.754294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.754310 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.857291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.857353 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.857375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.857403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.857424 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.960183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.960263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.960284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.960316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:09 crc kubenswrapper[4762]: I0217 17:49:09.960338 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:09Z","lastTransitionTime":"2026-02-17T17:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.046910 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 06:56:14.221352602 +0000 UTC Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.063320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.063375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.063393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.063419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.063438 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.166465 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.166496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.166504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.166517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.166526 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.268848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.268879 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.268889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.268902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.268913 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.371351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.371448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.371470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.371499 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.371516 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.474575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.474650 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.474684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.474729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.474751 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.577385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.577443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.577458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.577482 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.577500 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.681717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.681962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.681984 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.682015 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.682039 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.785935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.786003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.786026 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.786061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.786085 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.889081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.889122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.889130 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.889143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.889155 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.992844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.992908 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.992927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.992951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:10 crc kubenswrapper[4762]: I0217 17:49:10.992967 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:10Z","lastTransitionTime":"2026-02-17T17:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.034863 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.034886 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.034885 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.034962 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:11 crc kubenswrapper[4762]: E0217 17:49:11.035094 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:11 crc kubenswrapper[4762]: E0217 17:49:11.035217 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:11 crc kubenswrapper[4762]: E0217 17:49:11.035333 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:11 crc kubenswrapper[4762]: E0217 17:49:11.035445 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.047222 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:39:24.709618652 +0000 UTC Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.095549 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.095601 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.095616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.095663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.095681 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.198581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.198617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.198656 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.198672 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.198683 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.301998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.302061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.302079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.302106 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.302125 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.450199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.450259 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.450275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.450295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.450310 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.491768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:11 crc kubenswrapper[4762]: E0217 17:49:11.492148 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:49:11 crc kubenswrapper[4762]: E0217 17:49:11.492225 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs podName:6bb87d75-4230-44b9-8ee8-7aff6d051904 nodeName:}" failed. No retries permitted until 2026-02-17 17:50:15.492201145 +0000 UTC m=+167.137119165 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs") pod "network-metrics-daemon-wdzt7" (UID: "6bb87d75-4230-44b9-8ee8-7aff6d051904") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.552953 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.553032 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.553049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.553075 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.553092 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.656459 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.656532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.656550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.656574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.656594 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.760109 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.760210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.760231 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.760257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.760275 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.863761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.863883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.863910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.863941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.863961 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.966713 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.966769 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.966780 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.966798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:11 crc kubenswrapper[4762]: I0217 17:49:11.966810 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:11Z","lastTransitionTime":"2026-02-17T17:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.047347 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:36:02.63979009 +0000 UTC Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.069651 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.069711 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.069725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.069742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.069753 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.173407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.173468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.173485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.173509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.173527 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.276391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.276446 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.276460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.276482 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.276498 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.378838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.378897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.378914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.378937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.378954 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.487777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.487830 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.487850 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.487880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.487896 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.590691 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.590771 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.590797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.590831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.590857 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.693451 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.693502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.693514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.693531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.693547 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.796808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.796869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.796887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.796910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.796930 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.900027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.900095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.900114 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.900142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:12 crc kubenswrapper[4762]: I0217 17:49:12.900164 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:12Z","lastTransitionTime":"2026-02-17T17:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.002890 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.002932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.002943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.002960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.002972 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.035916 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:13 crc kubenswrapper[4762]: E0217 17:49:13.036056 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.035916 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.036138 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:13 crc kubenswrapper[4762]: E0217 17:49:13.036177 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.035913 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:13 crc kubenswrapper[4762]: E0217 17:49:13.036350 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:13 crc kubenswrapper[4762]: E0217 17:49:13.036405 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.047971 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 19:48:09.883616675 +0000 UTC Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.105678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.105983 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.106101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.106222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.106475 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.210034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.210108 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.210127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.210147 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.210166 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.312730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.312778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.312792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.312814 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.312826 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.415548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.415601 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.415619 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.415674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.415693 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.518467 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.518520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.518538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.518560 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.518577 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.621939 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.622314 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.622456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.622600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.622755 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.726321 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.726399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.726425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.726458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.726481 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.829281 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.829359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.829382 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.829411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.829434 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.931657 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.931706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.931718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.931737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:13 crc kubenswrapper[4762]: I0217 17:49:13.931748 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:13Z","lastTransitionTime":"2026-02-17T17:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.034227 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.034263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.034273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.034289 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.034301 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.048792 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 03:12:20.507425242 +0000 UTC Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.138305 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.138348 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.138363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.138385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.138400 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.241881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.241951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.241966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.241994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.242010 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.344124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.344180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.344200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.344225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.344242 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.446698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.447029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.447122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.447224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.447327 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.550132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.550393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.550509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.550582 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.550677 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.653430 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.653496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.653508 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.653535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.653548 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.756383 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.756434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.756448 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.756470 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.756486 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.860193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.860247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.860263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.860286 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.860299 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.963137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.963191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.963213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.963238 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:14 crc kubenswrapper[4762]: I0217 17:49:14.963256 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:14Z","lastTransitionTime":"2026-02-17T17:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.035663 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:15 crc kubenswrapper[4762]: E0217 17:49:15.035863 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.036151 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:15 crc kubenswrapper[4762]: E0217 17:49:15.036226 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.036365 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:15 crc kubenswrapper[4762]: E0217 17:49:15.036431 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.036492 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:15 crc kubenswrapper[4762]: E0217 17:49:15.036685 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.038361 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:49:15 crc kubenswrapper[4762]: E0217 17:49:15.038776 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-f6zrt_openshift-ovn-kubernetes(8e901c69-4b38-4f54-9811-83bd34c46a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.049727 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:52:03.020324007 +0000 UTC Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.066346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.066412 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.066433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.066460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.066479 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.169174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.169254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.169295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.169318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.169334 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.272337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.272375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.272386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.272400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.272412 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.375488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.375544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.375555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.375570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.375581 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.479162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.479248 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.479290 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.479316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.479329 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.582532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.582587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.582604 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.582675 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.582702 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.686200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.686318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.686337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.686369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.686392 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.789897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.789966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.789990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.790022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.790046 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.892522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.892616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.892679 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.892700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.892711 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.995817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.995857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.995866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.995879 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:15 crc kubenswrapper[4762]: I0217 17:49:15.995889 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:15Z","lastTransitionTime":"2026-02-17T17:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.050795 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 10:05:51.782446574 +0000 UTC Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.099085 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.099133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.099166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.099185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.099196 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:16Z","lastTransitionTime":"2026-02-17T17:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.202100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.202403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.202536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.202747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.202943 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:16Z","lastTransitionTime":"2026-02-17T17:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.305615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.305960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.306161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.306392 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.306534 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:16Z","lastTransitionTime":"2026-02-17T17:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.332333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.332373 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.332391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.332411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.332426 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:49:16Z","lastTransitionTime":"2026-02-17T17:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.392917 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fzb7v" podStartSLOduration=84.392892411 podStartE2EDuration="1m24.392892411s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:09.338650857 +0000 UTC m=+100.983568867" watchObservedRunningTime="2026-02-17 17:49:16.392892411 +0000 UTC m=+108.037810451" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.393798 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774"] Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.394315 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.398966 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.398993 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.399202 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.399227 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.450329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/89d48589-b2d5-4cc6-a097-90ac1d203ffb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.450357 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=7.450333202 podStartE2EDuration="7.450333202s" podCreationTimestamp="2026-02-17 17:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:16.448561611 +0000 UTC m=+108.093479631" watchObservedRunningTime="2026-02-17 17:49:16.450333202 +0000 UTC m=+108.095251222" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.450412 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/89d48589-b2d5-4cc6-a097-90ac1d203ffb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.450470 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d48589-b2d5-4cc6-a097-90ac1d203ffb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.450598 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89d48589-b2d5-4cc6-a097-90ac1d203ffb-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.450764 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d48589-b2d5-4cc6-a097-90ac1d203ffb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552039 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89d48589-b2d5-4cc6-a097-90ac1d203ffb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552113 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d48589-b2d5-4cc6-a097-90ac1d203ffb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552147 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/89d48589-b2d5-4cc6-a097-90ac1d203ffb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552199 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/89d48589-b2d5-4cc6-a097-90ac1d203ffb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552222 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d48589-b2d5-4cc6-a097-90ac1d203ffb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552300 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/89d48589-b2d5-4cc6-a097-90ac1d203ffb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.552392 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/89d48589-b2d5-4cc6-a097-90ac1d203ffb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.553154 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89d48589-b2d5-4cc6-a097-90ac1d203ffb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.562283 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d48589-b2d5-4cc6-a097-90ac1d203ffb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.568545 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d48589-b2d5-4cc6-a097-90ac1d203ffb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2g774\" (UID: \"89d48589-b2d5-4cc6-a097-90ac1d203ffb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: I0217 17:49:16.719069 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" Feb 17 17:49:16 crc kubenswrapper[4762]: W0217 17:49:16.741177 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d48589_b2d5_4cc6_a097_90ac1d203ffb.slice/crio-96b3b274954fe2190c32c215ac78abafa7b7e3120ac676d1e63d3b4a3c560815 WatchSource:0}: Error finding container 96b3b274954fe2190c32c215ac78abafa7b7e3120ac676d1e63d3b4a3c560815: Status 404 returned error can't find the container with id 96b3b274954fe2190c32c215ac78abafa7b7e3120ac676d1e63d3b4a3c560815 Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.035916 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.036079 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:17 crc kubenswrapper[4762]: E0217 17:49:17.036343 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.036476 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.036567 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:17 crc kubenswrapper[4762]: E0217 17:49:17.036604 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:17 crc kubenswrapper[4762]: E0217 17:49:17.036744 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:17 crc kubenswrapper[4762]: E0217 17:49:17.036797 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.051699 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:36:10.026033173 +0000 UTC Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.051763 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.061542 4762 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.575243 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" event={"ID":"89d48589-b2d5-4cc6-a097-90ac1d203ffb","Type":"ContainerStarted","Data":"ac0bbd87ee3fbba9343ad474e7aa21ef8c4da78541a89a370299f6bede14db5c"} Feb 17 17:49:17 crc kubenswrapper[4762]: I0217 17:49:17.575285 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" event={"ID":"89d48589-b2d5-4cc6-a097-90ac1d203ffb","Type":"ContainerStarted","Data":"96b3b274954fe2190c32c215ac78abafa7b7e3120ac676d1e63d3b4a3c560815"} Feb 17 17:49:19 crc kubenswrapper[4762]: I0217 17:49:19.035225 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:19 crc kubenswrapper[4762]: I0217 17:49:19.036551 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:19 crc kubenswrapper[4762]: I0217 17:49:19.036641 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:19 crc kubenswrapper[4762]: I0217 17:49:19.036724 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:19 crc kubenswrapper[4762]: E0217 17:49:19.036469 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:19 crc kubenswrapper[4762]: E0217 17:49:19.037439 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:19 crc kubenswrapper[4762]: E0217 17:49:19.037481 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:19 crc kubenswrapper[4762]: E0217 17:49:19.037823 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:21 crc kubenswrapper[4762]: I0217 17:49:21.035750 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:21 crc kubenswrapper[4762]: I0217 17:49:21.035827 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:21 crc kubenswrapper[4762]: I0217 17:49:21.035884 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:21 crc kubenswrapper[4762]: E0217 17:49:21.037478 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:21 crc kubenswrapper[4762]: E0217 17:49:21.037563 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:21 crc kubenswrapper[4762]: I0217 17:49:21.035925 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:21 crc kubenswrapper[4762]: E0217 17:49:21.037693 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:21 crc kubenswrapper[4762]: E0217 17:49:21.037345 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:23 crc kubenswrapper[4762]: I0217 17:49:23.035858 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:23 crc kubenswrapper[4762]: I0217 17:49:23.035906 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:23 crc kubenswrapper[4762]: E0217 17:49:23.036001 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:23 crc kubenswrapper[4762]: I0217 17:49:23.036024 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:23 crc kubenswrapper[4762]: E0217 17:49:23.036122 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:23 crc kubenswrapper[4762]: E0217 17:49:23.036223 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:23 crc kubenswrapper[4762]: I0217 17:49:23.036376 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:23 crc kubenswrapper[4762]: E0217 17:49:23.036441 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:25 crc kubenswrapper[4762]: I0217 17:49:25.035792 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:25 crc kubenswrapper[4762]: I0217 17:49:25.035944 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:25 crc kubenswrapper[4762]: I0217 17:49:25.036178 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:25 crc kubenswrapper[4762]: E0217 17:49:25.036173 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:25 crc kubenswrapper[4762]: I0217 17:49:25.036251 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:25 crc kubenswrapper[4762]: E0217 17:49:25.036297 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:25 crc kubenswrapper[4762]: E0217 17:49:25.036353 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:25 crc kubenswrapper[4762]: E0217 17:49:25.036378 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.035524 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.035612 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:27 crc kubenswrapper[4762]: E0217 17:49:27.036175 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.035741 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.035704 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:27 crc kubenswrapper[4762]: E0217 17:49:27.036313 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:27 crc kubenswrapper[4762]: E0217 17:49:27.036492 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:27 crc kubenswrapper[4762]: E0217 17:49:27.036726 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.607342 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/1.log" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.607917 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/0.log" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.607965 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0f706d4-18a1-44c0-8913-b46af7876ee7" containerID="88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0" exitCode=1 Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.607992 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerDied","Data":"88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0"} Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.608033 4762 scope.go:117] "RemoveContainer" containerID="fc8dee18c3b81bd0eea184e04177ee737b14f21bf8fa7e9017c80e8f043336f9" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.608733 4762 scope.go:117] "RemoveContainer" containerID="88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0" Feb 17 17:49:27 crc kubenswrapper[4762]: E0217 
17:49:27.609061 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-k2xfd_openshift-multus(d0f706d4-18a1-44c0-8913-b46af7876ee7)\"" pod="openshift-multus/multus-k2xfd" podUID="d0f706d4-18a1-44c0-8913-b46af7876ee7" Feb 17 17:49:27 crc kubenswrapper[4762]: I0217 17:49:27.637257 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2g774" podStartSLOduration=95.637230168 podStartE2EDuration="1m35.637230168s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:17.596189202 +0000 UTC m=+109.241107212" watchObservedRunningTime="2026-02-17 17:49:27.637230168 +0000 UTC m=+119.282148218" Feb 17 17:49:28 crc kubenswrapper[4762]: I0217 17:49:28.613075 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/1.log" Feb 17 17:49:29 crc kubenswrapper[4762]: I0217 17:49:29.035109 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:29 crc kubenswrapper[4762]: I0217 17:49:29.035440 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:29 crc kubenswrapper[4762]: I0217 17:49:29.035364 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:29 crc kubenswrapper[4762]: I0217 17:49:29.035207 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:29 crc kubenswrapper[4762]: E0217 17:49:29.036078 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:29 crc kubenswrapper[4762]: E0217 17:49:29.036379 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:29 crc kubenswrapper[4762]: E0217 17:49:29.036595 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:29 crc kubenswrapper[4762]: E0217 17:49:29.036743 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:29 crc kubenswrapper[4762]: E0217 17:49:29.075471 4762 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 17:49:29 crc kubenswrapper[4762]: E0217 17:49:29.139190 4762 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.036140 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.622283 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/3.log" Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.624528 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerStarted","Data":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.624866 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.652498 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podStartSLOduration=97.652482527 podStartE2EDuration="1m37.652482527s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:30.651144118 +0000 UTC m=+122.296062128" 
watchObservedRunningTime="2026-02-17 17:49:30.652482527 +0000 UTC m=+122.297400537" Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.838955 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wdzt7"] Feb 17 17:49:30 crc kubenswrapper[4762]: I0217 17:49:30.839119 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:30 crc kubenswrapper[4762]: E0217 17:49:30.839535 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:31 crc kubenswrapper[4762]: I0217 17:49:31.035324 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:31 crc kubenswrapper[4762]: I0217 17:49:31.035386 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:31 crc kubenswrapper[4762]: I0217 17:49:31.035474 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:31 crc kubenswrapper[4762]: E0217 17:49:31.035853 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:31 crc kubenswrapper[4762]: E0217 17:49:31.035951 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:31 crc kubenswrapper[4762]: E0217 17:49:31.036048 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:33 crc kubenswrapper[4762]: I0217 17:49:33.035832 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:33 crc kubenswrapper[4762]: I0217 17:49:33.035841 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:33 crc kubenswrapper[4762]: I0217 17:49:33.035995 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:33 crc kubenswrapper[4762]: I0217 17:49:33.036111 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:33 crc kubenswrapper[4762]: E0217 17:49:33.036261 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:33 crc kubenswrapper[4762]: E0217 17:49:33.036374 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:33 crc kubenswrapper[4762]: E0217 17:49:33.036425 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:33 crc kubenswrapper[4762]: E0217 17:49:33.036473 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:34 crc kubenswrapper[4762]: E0217 17:49:34.140850 4762 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 17:49:35 crc kubenswrapper[4762]: I0217 17:49:35.035170 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:35 crc kubenswrapper[4762]: I0217 17:49:35.035207 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:35 crc kubenswrapper[4762]: I0217 17:49:35.035315 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:35 crc kubenswrapper[4762]: E0217 17:49:35.035317 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:35 crc kubenswrapper[4762]: I0217 17:49:35.035340 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:35 crc kubenswrapper[4762]: E0217 17:49:35.035426 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:35 crc kubenswrapper[4762]: E0217 17:49:35.035549 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:35 crc kubenswrapper[4762]: E0217 17:49:35.035574 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:36 crc kubenswrapper[4762]: I0217 17:49:36.447487 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:49:37 crc kubenswrapper[4762]: I0217 17:49:37.035827 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:37 crc kubenswrapper[4762]: I0217 17:49:37.035827 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:37 crc kubenswrapper[4762]: I0217 17:49:37.035819 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:37 crc kubenswrapper[4762]: I0217 17:49:37.035874 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:37 crc kubenswrapper[4762]: E0217 17:49:37.036375 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:37 crc kubenswrapper[4762]: E0217 17:49:37.036456 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:37 crc kubenswrapper[4762]: E0217 17:49:37.036524 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:37 crc kubenswrapper[4762]: E0217 17:49:37.036634 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.035743 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.035892 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.035904 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.036250 4762 scope.go:117] "RemoveContainer" containerID="88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0" Feb 17 17:49:39 crc kubenswrapper[4762]: E0217 17:49:39.037345 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.037475 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:39 crc kubenswrapper[4762]: E0217 17:49:39.037898 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:39 crc kubenswrapper[4762]: E0217 17:49:39.038314 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:39 crc kubenswrapper[4762]: E0217 17:49:39.038811 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:39 crc kubenswrapper[4762]: E0217 17:49:39.141541 4762 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.658349 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/1.log" Feb 17 17:49:39 crc kubenswrapper[4762]: I0217 17:49:39.658436 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerStarted","Data":"e19faa18f6cade3c3f82c533ec423e13be43192275899b9259b9cc023d77df2e"} Feb 17 17:49:41 crc kubenswrapper[4762]: I0217 17:49:41.035667 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:41 crc kubenswrapper[4762]: I0217 17:49:41.035708 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:41 crc kubenswrapper[4762]: E0217 17:49:41.035823 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:41 crc kubenswrapper[4762]: I0217 17:49:41.035917 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:41 crc kubenswrapper[4762]: E0217 17:49:41.036047 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:41 crc kubenswrapper[4762]: E0217 17:49:41.036181 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:41 crc kubenswrapper[4762]: I0217 17:49:41.036504 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:41 crc kubenswrapper[4762]: E0217 17:49:41.036897 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:43 crc kubenswrapper[4762]: I0217 17:49:43.035452 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:43 crc kubenswrapper[4762]: I0217 17:49:43.035488 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:43 crc kubenswrapper[4762]: I0217 17:49:43.035502 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:43 crc kubenswrapper[4762]: I0217 17:49:43.035467 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:43 crc kubenswrapper[4762]: E0217 17:49:43.035575 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:49:43 crc kubenswrapper[4762]: E0217 17:49:43.035764 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:49:43 crc kubenswrapper[4762]: E0217 17:49:43.035755 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:49:43 crc kubenswrapper[4762]: E0217 17:49:43.035828 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdzt7" podUID="6bb87d75-4230-44b9-8ee8-7aff6d051904" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.035962 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.036012 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.036013 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.036162 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.038656 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.038926 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.039218 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.039426 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.039708 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 17:49:45 crc kubenswrapper[4762]: I0217 17:49:45.041386 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.901978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.941110 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.941494 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.943957 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smpx4"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.944412 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.945943 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-htp99"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.946497 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.946832 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.947373 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.949747 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.950137 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.950270 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.950427 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.950446 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.950480 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.950504 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.951756 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.955673 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.959861 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.959864 
4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.960032 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.962004 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.963935 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nsnbr"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.963960 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.965162 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.965615 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.970672 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.971384 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.972788 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.973451 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.973639 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.973793 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.974910 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.982163 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.991536 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.991687 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.992025 4762 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.992536 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.992846 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.993125 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.993603 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.993936 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.994120 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.994579 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fgqx5"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.994939 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.995111 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.995589 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-clnl8"] Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.996055 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.996477 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:46 crc kubenswrapper[4762]: I0217 17:49:46.997748 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:46.999682 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zfmsb"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.000172 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.000750 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.001330 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.002544 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9bp9t"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.004586 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7lmj7"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.005453 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.006429 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.007425 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.007550 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.009494 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273b9986-2821-4038-809b-3ecc7730baca-config\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.009584 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57hkg\" (UniqueName: \"kubernetes.io/projected/f4416087-9030-4283-9d76-ea247185026e-kube-api-access-57hkg\") pod \"cluster-samples-operator-665b6dd947-4g9gr\" (UID: \"f4416087-9030-4283-9d76-ea247185026e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.009686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-service-ca-bundle\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.009763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-config\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.009892 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-serving-cert\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.009976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ac005ed9-eab0-4e8a-952d-45e6695640ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010074 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-audit-dir\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010188 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4c2h5\" (UniqueName: \"kubernetes.io/projected/273b9986-2821-4038-809b-3ecc7730baca-kube-api-access-4c2h5\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010287 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-serving-cert\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010380 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-client-ca\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tprpj\" (UniqueName: \"kubernetes.io/projected/3c1453fe-730e-49d9-9d85-efbfec1ca329-kube-api-access-tprpj\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010572 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-etcd-serving-ca\") pod \"apiserver-76f77b778f-nsnbr\" (UID: 
\"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010709 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010815 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.010910 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c7f4\" (UniqueName: \"kubernetes.io/projected/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-kube-api-access-2c7f4\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011003 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxfs\" (UniqueName: \"kubernetes.io/projected/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-kube-api-access-msxfs\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011098 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-encryption-config\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011194 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/273b9986-2821-4038-809b-3ecc7730baca-serving-cert\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011284 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac005ed9-eab0-4e8a-952d-45e6695640ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-audit\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011469 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjchk\" (UniqueName: \"kubernetes.io/projected/ac005ed9-eab0-4e8a-952d-45e6695640ca-kube-api-access-xjchk\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011563 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52b22\" (UniqueName: \"kubernetes.io/projected/27402239-9191-42d8-89b6-8c0e12e54497-kube-api-access-52b22\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011787 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27402239-9191-42d8-89b6-8c0e12e54497-images\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011888 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27402239-9191-42d8-89b6-8c0e12e54497-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011965 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-image-import-ca\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012040 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-config\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012120 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-client-ca\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012288 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1453fe-730e-49d9-9d85-efbfec1ca329-serving-cert\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012351 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-config\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27402239-9191-42d8-89b6-8c0e12e54497-config\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012508 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/273b9986-2821-4038-809b-3ecc7730baca-trusted-ca\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.008641 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.012641 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-node-pullsecrets\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011481 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.011571 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.013076 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.013537 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.013582 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5r5v9"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.013751 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.013887 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.013965 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014011 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014064 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014181 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014253 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz"] Feb 17 17:49:47 crc 
kubenswrapper[4762]: I0217 17:49:47.014312 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014443 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014608 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014723 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014731 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014808 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014881 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014328 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.014332 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015068 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015078 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015238 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015260 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015316 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015132 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015579 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.015929 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.016687 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.016824 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.016972 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.018279 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-config\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.018308 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4416087-9030-4283-9d76-ea247185026e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4g9gr\" (UID: \"f4416087-9030-4283-9d76-ea247185026e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.018325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6zv\" (UniqueName: \"kubernetes.io/projected/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-kube-api-access-5m6zv\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.018348 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-etcd-client\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.018363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-serving-cert\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.020764 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.020927 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.021059 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.021961 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022129 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022176 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022347 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022373 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022690 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022761 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022763 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022797 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022776 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.022647 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.039260 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.040783 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.041927 4762 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.042079 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.042882 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.049400 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.052805 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.053280 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.053462 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.053745 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.053757 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.053969 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.054194 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 
17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.054348 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.054765 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.054849 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.054928 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.055876 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.056013 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.056146 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.056938 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.057133 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.057498 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.059088 
4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.061980 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.063417 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.063596 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.063887 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.064773 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.066513 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.066316 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.071265 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.071312 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.073168 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.074752 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pvxtx"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.075553 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.076967 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.090595 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.091161 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.097846 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.098682 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.098939 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.099851 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.108332 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8p4kj"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.109282 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.111565 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zc64c"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.112219 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.113986 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.118281 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.118867 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120042 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120079 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c7f4\" (UniqueName: \"kubernetes.io/projected/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-kube-api-access-2c7f4\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120120 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxfs\" (UniqueName: \"kubernetes.io/projected/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-kube-api-access-msxfs\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120140 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-etcd-client\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120155 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-serving-cert\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120171 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6n9t\" (UniqueName: \"kubernetes.io/projected/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-kube-api-access-q6n9t\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-encryption-config\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120206 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/273b9986-2821-4038-809b-3ecc7730baca-serving-cert\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120222 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120241 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac005ed9-eab0-4e8a-952d-45e6695640ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120257 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwhq\" (UniqueName: \"kubernetes.io/projected/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-kube-api-access-vkwhq\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120273 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-audit\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120288 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-audit-policies\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: 
\"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120303 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-encryption-config\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120320 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjchk\" (UniqueName: \"kubernetes.io/projected/ac005ed9-eab0-4e8a-952d-45e6695640ca-kube-api-access-xjchk\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120336 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52b22\" (UniqueName: \"kubernetes.io/projected/27402239-9191-42d8-89b6-8c0e12e54497-kube-api-access-52b22\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120354 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120371 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89pb\" (UniqueName: \"kubernetes.io/projected/48ea904c-39ba-449b-bb94-2aa5a0821e9c-kube-api-access-c89pb\") pod \"migrator-59844c95c7-8v48w\" (UID: \"48ea904c-39ba-449b-bb94-2aa5a0821e9c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27402239-9191-42d8-89b6-8c0e12e54497-images\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120411 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120427 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120442 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3122cdf-f24a-434e-a9f5-49b561090de6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" 
(UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120459 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27402239-9191-42d8-89b6-8c0e12e54497-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120473 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-machine-approver-tls\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120488 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-service-ca\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120504 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-oauth-serving-cert\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120519 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-image-import-ca\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120536 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-auth-proxy-config\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120554 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-config\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120571 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdt9t\" (UniqueName: \"kubernetes.io/projected/60395c5c-944a-4aa8-a01d-c8619c2295ad-kube-api-access-vdt9t\") pod \"downloads-7954f5f757-5r5v9\" (UID: \"60395c5c-944a-4aa8-a01d-c8619c2295ad\") " pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120588 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120603 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-trusted-ca-bundle\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120647 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-client-ca\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7ml7\" (UniqueName: \"kubernetes.io/projected/8a02f5c2-7bfb-405b-829e-0b284148e255-kube-api-access-l7ml7\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120710 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddab6d46-4abb-415c-a416-e8131610b68d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120736 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-config\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27402239-9191-42d8-89b6-8c0e12e54497-config\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120854 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1453fe-730e-49d9-9d85-efbfec1ca329-serving-cert\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120895 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b012c2c-f737-4c39-99de-e2d747b395d0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120924 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120946 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3122cdf-f24a-434e-a9f5-49b561090de6-config\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120972 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.120993 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-audit-dir\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121026 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/273b9986-2821-4038-809b-3ecc7730baca-trusted-ca\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121051 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-oauth-config\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121083 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-node-pullsecrets\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc 
kubenswrapper[4762]: I0217 17:49:47.121107 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-config\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121134 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4416087-9030-4283-9d76-ea247185026e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4g9gr\" (UID: \"f4416087-9030-4283-9d76-ea247185026e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121156 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121181 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-config\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121202 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121223 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9x92\" (UniqueName: \"kubernetes.io/projected/ddab6d46-4abb-415c-a416-e8131610b68d-kube-api-access-m9x92\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121249 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-etcd-client\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121270 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-serving-cert\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121292 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6zv\" (UniqueName: \"kubernetes.io/projected/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-kube-api-access-5m6zv\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121315 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmjw4\" (UniqueName: \"kubernetes.io/projected/3bcd89b8-e038-4635-b0e3-f4b45607811b-kube-api-access-bmjw4\") pod \"dns-operator-744455d44c-7lmj7\" (UID: \"3bcd89b8-e038-4635-b0e3-f4b45607811b\") " pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121336 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3122cdf-f24a-434e-a9f5-49b561090de6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121358 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-serving-cert\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121398 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b012c2c-f737-4c39-99de-e2d747b395d0-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121417 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc569\" (UniqueName: \"kubernetes.io/projected/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-kube-api-access-gc569\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121454 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a02f5c2-7bfb-405b-829e-0b284148e255-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121466 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-audit\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121475 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-metrics-certs\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121515 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27402239-9191-42d8-89b6-8c0e12e54497-config\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121526 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121534 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121587 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273b9986-2821-4038-809b-3ecc7730baca-config\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " 
pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121616 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57hkg\" (UniqueName: \"kubernetes.io/projected/f4416087-9030-4283-9d76-ea247185026e-kube-api-access-57hkg\") pod \"cluster-samples-operator-665b6dd947-4g9gr\" (UID: \"f4416087-9030-4283-9d76-ea247185026e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121696 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-default-certificate\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121722 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-dir\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121761 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a02f5c2-7bfb-405b-829e-0b284148e255-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121785 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121809 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ddab6d46-4abb-415c-a416-e8131610b68d-images\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121832 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwgh\" (UniqueName: \"kubernetes.io/projected/0b012c2c-f737-4c39-99de-e2d747b395d0-kube-api-access-vkwgh\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-config\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121886 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-config\") pod \"authentication-operator-69f744f599-clnl8\" (UID: 
\"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121912 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-service-ca-bundle\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121934 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpbph\" (UniqueName: \"kubernetes.io/projected/35fb25d5-f8ca-43c5-ae4d-31da698c4780-kube-api-access-kpbph\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2dz\" (UniqueName: \"kubernetes.io/projected/c7f82eed-54cf-4b40-b996-e23d502a4f9e-kube-api-access-km2dz\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.121982 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-serving-cert\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122005 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ac005ed9-eab0-4e8a-952d-45e6695640ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122030 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122054 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wxz\" (UniqueName: \"kubernetes.io/projected/86c83b85-567c-43f9-ac88-e332e05bea98-kube-api-access-s7wxz\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122078 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122100 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bcd89b8-e038-4635-b0e3-f4b45607811b-metrics-tls\") pod \"dns-operator-744455d44c-7lmj7\" (UID: \"3bcd89b8-e038-4635-b0e3-f4b45607811b\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122136 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-audit-dir\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122163 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c2h5\" (UniqueName: \"kubernetes.io/projected/273b9986-2821-4038-809b-3ecc7730baca-kube-api-access-4c2h5\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122187 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-serving-cert\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122240 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122259 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddab6d46-4abb-415c-a416-e8131610b68d-proxy-tls\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122282 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-client-ca\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27402239-9191-42d8-89b6-8c0e12e54497-images\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122322 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tprpj\" (UniqueName: \"kubernetes.io/projected/3c1453fe-730e-49d9-9d85-efbfec1ca329-kube-api-access-tprpj\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 
17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122350 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-stats-auth\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122377 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-etcd-serving-ca\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c83b85-567c-43f9-ac88-e332e05bea98-service-ca-bundle\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122432 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-policies\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.122486 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-config\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.123128 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-client-ca\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.123427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273b9986-2821-4038-809b-3ecc7730baca-config\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.123652 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.124914 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-service-ca-bundle\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.124964 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-config\") pod 
\"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.125425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-client-ca\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.125610 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-etcd-serving-ca\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.128140 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.128517 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.128736 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ac005ed9-eab0-4e8a-952d-45e6695640ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.129967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-config\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.130076 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-serving-cert\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.130182 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-audit-dir\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.130218 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.131116 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-image-import-ca\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.131194 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-node-pullsecrets\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.131815 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/273b9986-2821-4038-809b-3ecc7730baca-trusted-ca\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.132241 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4416087-9030-4283-9d76-ea247185026e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4g9gr\" (UID: \"f4416087-9030-4283-9d76-ea247185026e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.132341 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-config\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.133174 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27402239-9191-42d8-89b6-8c0e12e54497-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.133279 4762 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-etcd-client\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.133311 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1453fe-730e-49d9-9d85-efbfec1ca329-serving-cert\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.135196 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-serving-cert\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.135409 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-serving-cert\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.135939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-encryption-config\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.142144 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac005ed9-eab0-4e8a-952d-45e6695640ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.144117 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.148043 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.149198 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.149897 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.149924 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.150336 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.150799 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.150926 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.151012 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.151068 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.151661 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.151757 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.152940 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.153219 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.153872 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.153965 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.154848 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.156162 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gktn"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.156590 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.158421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/273b9986-2821-4038-809b-3ecc7730baca-serving-cert\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.160706 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2z554"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.161650 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.166574 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.168668 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-l8z85"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.171773 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.176824 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.177353 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.177431 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.178078 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.179196 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.180494 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smpx4"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.181595 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-htp99"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.182682 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5r5v9"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.183316 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.183868 4762 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns/dns-default-fwzcf"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.185275 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.185352 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.185984 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9bp9t"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.191749 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7lmj7"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.197787 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.198975 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8p4kj"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.201274 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nsnbr"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.203356 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fgqx5"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.203514 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.207094 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.208260 4762 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.209702 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.211202 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.213165 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.214774 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.215863 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-clnl8"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.216993 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.218234 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.219737 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.221100 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zfmsb"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.222162 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223232 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223234 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223398 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e14e621-40b7-4585-b793-dfd0337aec04-secret-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223437 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223463 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwhq\" (UniqueName: \"kubernetes.io/projected/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-kube-api-access-vkwhq\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-audit-policies\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223518 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-encryption-config\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223582 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89pb\" (UniqueName: \"kubernetes.io/projected/48ea904c-39ba-449b-bb94-2aa5a0821e9c-kube-api-access-c89pb\") pod \"migrator-59844c95c7-8v48w\" (UID: \"48ea904c-39ba-449b-bb94-2aa5a0821e9c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223612 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc 
kubenswrapper[4762]: I0217 17:49:47.223656 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223679 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3122cdf-f24a-434e-a9f5-49b561090de6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223702 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-machine-approver-tls\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223722 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-service-ca\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223742 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-oauth-serving-cert\") pod \"console-f9d7485db-zfmsb\" (UID: 
\"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-auth-proxy-config\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223785 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223809 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdt9t\" (UniqueName: \"kubernetes.io/projected/60395c5c-944a-4aa8-a01d-c8619c2295ad-kube-api-access-vdt9t\") pod \"downloads-7954f5f757-5r5v9\" (UID: \"60395c5c-944a-4aa8-a01d-c8619c2295ad\") " pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223833 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223854 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-trusted-ca-bundle\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7ml7\" (UniqueName: \"kubernetes.io/projected/8a02f5c2-7bfb-405b-829e-0b284148e255-kube-api-access-l7ml7\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223930 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddab6d46-4abb-415c-a416-e8131610b68d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223963 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.223991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b012c2c-f737-4c39-99de-e2d747b395d0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3122cdf-f24a-434e-a9f5-49b561090de6-config\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224061 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224080 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-audit-dir\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224104 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-oauth-config\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vhnw\" (UniqueName: \"kubernetes.io/projected/084cdb6a-4e10-40fd-b651-d628bc556172-kube-api-access-6vhnw\") pod \"multus-admission-controller-857f4d67dd-8p4kj\" (UID: \"084cdb6a-4e10-40fd-b651-d628bc556172\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224155 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224177 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-config\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224198 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224215 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9x92\" (UniqueName: \"kubernetes.io/projected/ddab6d46-4abb-415c-a416-e8131610b68d-kube-api-access-m9x92\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224238 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmjw4\" (UniqueName: \"kubernetes.io/projected/3bcd89b8-e038-4635-b0e3-f4b45607811b-kube-api-access-bmjw4\") pod \"dns-operator-744455d44c-7lmj7\" (UID: \"3bcd89b8-e038-4635-b0e3-f4b45607811b\") " pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224253 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3122cdf-f24a-434e-a9f5-49b561090de6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224273 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wkkv\" (UniqueName: \"kubernetes.io/projected/5e14e621-40b7-4585-b793-dfd0337aec04-kube-api-access-4wkkv\") pod 
\"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224291 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224305 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-serving-cert\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224322 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b012c2c-f737-4c39-99de-e2d747b395d0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224341 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc569\" (UniqueName: \"kubernetes.io/projected/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-kube-api-access-gc569\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224369 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a02f5c2-7bfb-405b-829e-0b284148e255-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224395 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-metrics-certs\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224486 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224515 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-default-certificate\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") 
" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224538 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-dir\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224561 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a02f5c2-7bfb-405b-829e-0b284148e255-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224579 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224596 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ddab6d46-4abb-415c-a416-e8131610b68d-images\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224613 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwgh\" (UniqueName: 
\"kubernetes.io/projected/0b012c2c-f737-4c39-99de-e2d747b395d0-kube-api-access-vkwgh\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224647 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-config\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224664 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpbph\" (UniqueName: \"kubernetes.io/projected/35fb25d5-f8ca-43c5-ae4d-31da698c4780-kube-api-access-kpbph\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224681 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2dz\" (UniqueName: \"kubernetes.io/projected/c7f82eed-54cf-4b40-b996-e23d502a4f9e-kube-api-access-km2dz\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224699 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 
17:49:47.224714 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wxz\" (UniqueName: \"kubernetes.io/projected/86c83b85-567c-43f9-ac88-e332e05bea98-kube-api-access-s7wxz\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bcd89b8-e038-4635-b0e3-f4b45607811b-metrics-tls\") pod \"dns-operator-744455d44c-7lmj7\" (UID: \"3bcd89b8-e038-4635-b0e3-f4b45607811b\") " pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224774 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/084cdb6a-4e10-40fd-b651-d628bc556172-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8p4kj\" (UID: \"084cdb6a-4e10-40fd-b651-d628bc556172\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224808 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224834 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224854 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddab6d46-4abb-415c-a416-e8131610b68d-proxy-tls\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224879 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-stats-auth\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224896 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c83b85-567c-43f9-ac88-e332e05bea98-service-ca-bundle\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224914 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-policies\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224940 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-etcd-client\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224958 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-serving-cert\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.224975 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6n9t\" (UniqueName: \"kubernetes.io/projected/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-kube-api-access-q6n9t\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.225381 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-config\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.225413 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-oauth-serving-cert\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.225425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.225561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddab6d46-4abb-415c-a416-e8131610b68d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.226137 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-audit-dir\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.226151 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.226187 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-service-ca\") pod \"console-f9d7485db-zfmsb\" (UID: 
\"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.226328 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-dir\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.226587 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3122cdf-f24a-434e-a9f5-49b561090de6-config\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.226650 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zc64c"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.227285 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-trusted-ca-bundle\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.227402 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-config\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.227564 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-auth-proxy-config\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.228228 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.228333 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.228470 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.228886 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-encryption-config\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.229047 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-serving-cert\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.229227 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.229552 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gktn"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.229659 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b012c2c-f737-4c39-99de-e2d747b395d0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.230109 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-machine-approver-tls\") pod \"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.230372 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.230647 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.231708 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bcd89b8-e038-4635-b0e3-f4b45607811b-metrics-tls\") pod \"dns-operator-744455d44c-7lmj7\" (UID: \"3bcd89b8-e038-4635-b0e3-f4b45607811b\") " pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.232255 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-audit-policies\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.232484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.232939 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.234103 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3122cdf-f24a-434e-a9f5-49b561090de6-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.234147 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-l8z85"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.234750 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-serving-cert\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.235117 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b012c2c-f737-4c39-99de-e2d747b395d0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.235196 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.235513 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.235662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-policies\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.235909 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-etcd-client\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.235986 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.236088 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7f82eed-54cf-4b40-b996-e23d502a4f9e-console-oauth-config\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.236248 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.236346 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.236392 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2z554"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.237546 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.238291 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a02f5c2-7bfb-405b-829e-0b284148e255-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.238511 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.239455 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fwzcf"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.240391 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8cnsj"] Feb 17 17:49:47 crc kubenswrapper[4762]: 
I0217 17:49:47.241058 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.241396 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l7mfh"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.243178 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8cnsj"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.243321 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.243333 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.244223 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.244429 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.244844 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.244894 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.245962 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l7mfh"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.246718 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.247488 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8dtsm"] Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.248237 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.248357 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a02f5c2-7bfb-405b-829e-0b284148e255-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.290092 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.295452 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.304203 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.323660 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.325832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/084cdb6a-4e10-40fd-b651-d628bc556172-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8p4kj\" (UID: \"084cdb6a-4e10-40fd-b651-d628bc556172\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:47 crc 
kubenswrapper[4762]: I0217 17:49:47.325942 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e14e621-40b7-4585-b793-dfd0337aec04-secret-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.326041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.326248 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vhnw\" (UniqueName: \"kubernetes.io/projected/084cdb6a-4e10-40fd-b651-d628bc556172-kube-api-access-6vhnw\") pod \"multus-admission-controller-857f4d67dd-8p4kj\" (UID: \"084cdb6a-4e10-40fd-b651-d628bc556172\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.326319 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wkkv\" (UniqueName: \"kubernetes.io/projected/5e14e621-40b7-4585-b793-dfd0337aec04-kube-api-access-4wkkv\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.329037 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.345525 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.364031 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.369296 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-metrics-certs\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.383611 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.389426 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-default-certificate\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.403672 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.408824 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/86c83b85-567c-43f9-ac88-e332e05bea98-stats-auth\") pod \"router-default-5444994796-pvxtx\" (UID: 
\"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.424152 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.427152 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c83b85-567c-43f9-ac88-e332e05bea98-service-ca-bundle\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.444303 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.464027 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.483316 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.504082 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.510320 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddab6d46-4abb-415c-a416-e8131610b68d-proxy-tls\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.523617 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.527561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ddab6d46-4abb-415c-a416-e8131610b68d-images\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.543711 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.564297 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.584057 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.604337 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.624768 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.629752 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/084cdb6a-4e10-40fd-b651-d628bc556172-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8p4kj\" (UID: \"084cdb6a-4e10-40fd-b651-d628bc556172\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.644654 4762 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.663477 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.684193 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.703973 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.723523 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.744231 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.764872 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.784710 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.823569 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c7f4\" (UniqueName: \"kubernetes.io/projected/d2fcbe2b-49c4-450c-afaa-16668ee4e44a-kube-api-access-2c7f4\") pod \"authentication-operator-69f744f599-clnl8\" (UID: \"d2fcbe2b-49c4-450c-afaa-16668ee4e44a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 
17:49:47.842023 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxfs\" (UniqueName: \"kubernetes.io/projected/8b52d7ad-b700-4bdb-87bb-94a66d8aaac2-kube-api-access-msxfs\") pod \"apiserver-76f77b778f-nsnbr\" (UID: \"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2\") " pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.861658 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjchk\" (UniqueName: \"kubernetes.io/projected/ac005ed9-eab0-4e8a-952d-45e6695640ca-kube-api-access-xjchk\") pod \"openshift-config-operator-7777fb866f-4bfv5\" (UID: \"ac005ed9-eab0-4e8a-952d-45e6695640ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.888688 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52b22\" (UniqueName: \"kubernetes.io/projected/27402239-9191-42d8-89b6-8c0e12e54497-kube-api-access-52b22\") pod \"machine-api-operator-5694c8668f-smpx4\" (UID: \"27402239-9191-42d8-89b6-8c0e12e54497\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.900203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57hkg\" (UniqueName: \"kubernetes.io/projected/f4416087-9030-4283-9d76-ea247185026e-kube-api-access-57hkg\") pod \"cluster-samples-operator-665b6dd947-4g9gr\" (UID: \"f4416087-9030-4283-9d76-ea247185026e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.915501 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.927348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2h5\" (UniqueName: \"kubernetes.io/projected/273b9986-2821-4038-809b-3ecc7730baca-kube-api-access-4c2h5\") pod \"console-operator-58897d9998-fgqx5\" (UID: \"273b9986-2821-4038-809b-3ecc7730baca\") " pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.944405 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tprpj\" (UniqueName: \"kubernetes.io/projected/3c1453fe-730e-49d9-9d85-efbfec1ca329-kube-api-access-tprpj\") pod \"route-controller-manager-6576b87f9c-gjb9r\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.948222 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.964277 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6zv\" (UniqueName: \"kubernetes.io/projected/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-kube-api-access-5m6zv\") pod \"controller-manager-879f6c89f-htp99\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.964616 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.968732 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.979410 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" Feb 17 17:49:47 crc kubenswrapper[4762]: I0217 17:49:47.984869 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.003736 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.004243 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.024810 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.044864 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.065239 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.085106 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.109765 4762 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.125320 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.156116 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.163102 4762 request.go:700] Waited for 1.010787757s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.164704 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.164861 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.167929 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5"] Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.170838 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.183881 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.195331 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.199236 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr"] Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.204072 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.226028 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.244461 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.245539 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nsnbr"] Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.263885 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-clnl8"] Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.264152 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: W0217 17:49:48.268758 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b52d7ad_b700_4bdb_87bb_94a66d8aaac2.slice/crio-3aeb8c210005cf8ee644e245e6c4028736da874832cf3e765efde1c9ad1eff25 WatchSource:0}: Error finding container 3aeb8c210005cf8ee644e245e6c4028736da874832cf3e765efde1c9ad1eff25: Status 404 returned error can't find the container with id 3aeb8c210005cf8ee644e245e6c4028736da874832cf3e765efde1c9ad1eff25 Feb 17 17:49:48 crc kubenswrapper[4762]: 
I0217 17:49:48.286265 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.304453 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.325041 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: E0217 17:49:48.326394 4762 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 17:49:48 crc kubenswrapper[4762]: E0217 17:49:48.326451 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume podName:5e14e621-40b7-4585-b793-dfd0337aec04 nodeName:}" failed. No retries permitted until 2026-02-17 17:49:48.826434457 +0000 UTC m=+140.471352467 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume") pod "collect-profiles-29522505-kdv7g" (UID: "5e14e621-40b7-4585-b793-dfd0337aec04") : failed to sync configmap cache: timed out waiting for the condition Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.333399 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e14e621-40b7-4585-b793-dfd0337aec04-secret-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.344399 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.364177 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.384012 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.389094 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r"] Feb 17 17:49:48 crc kubenswrapper[4762]: W0217 17:49:48.396276 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c1453fe_730e_49d9_9d85_efbfec1ca329.slice/crio-8ccaee0ec0074f7c7137a012d5b5a56f01aad1f3f5e5f69266d2323616ea189e WatchSource:0}: Error finding container 8ccaee0ec0074f7c7137a012d5b5a56f01aad1f3f5e5f69266d2323616ea189e: Status 404 returned error can't find the container 
with id 8ccaee0ec0074f7c7137a012d5b5a56f01aad1f3f5e5f69266d2323616ea189e Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.403258 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.424235 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.427470 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-htp99"] Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.444090 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.453840 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fgqx5"] Feb 17 17:49:48 crc kubenswrapper[4762]: W0217 17:49:48.461981 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d07fe7f_b9d0_4e9d_ab69_bafb51ae62ce.slice/crio-028478e88fce412c248fafaf5105889871e25671eee4c5f15c8e01aca3a96177 WatchSource:0}: Error finding container 028478e88fce412c248fafaf5105889871e25671eee4c5f15c8e01aca3a96177: Status 404 returned error can't find the container with id 028478e88fce412c248fafaf5105889871e25671eee4c5f15c8e01aca3a96177 Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.463739 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 17:49:48 crc kubenswrapper[4762]: W0217 17:49:48.476880 4762 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273b9986_2821_4038_809b_3ecc7730baca.slice/crio-26f461a1f8b0d6747158e6f99e192e5426ee407ca75bc1ef26799007df00a792 WatchSource:0}: Error finding container 26f461a1f8b0d6747158e6f99e192e5426ee407ca75bc1ef26799007df00a792: Status 404 returned error can't find the container with id 26f461a1f8b0d6747158e6f99e192e5426ee407ca75bc1ef26799007df00a792 Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.483486 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.503400 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.530139 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.545515 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.567703 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.574056 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smpx4"] Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.583287 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.604381 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: W0217 17:49:48.612123 4762 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27402239_9191_42d8_89b6_8c0e12e54497.slice/crio-5d26b493757447bad71f3af1645855dc2ef4031d554f6c8defb38aa84e155fd2 WatchSource:0}: Error finding container 5d26b493757447bad71f3af1645855dc2ef4031d554f6c8defb38aa84e155fd2: Status 404 returned error can't find the container with id 5d26b493757447bad71f3af1645855dc2ef4031d554f6c8defb38aa84e155fd2 Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.624913 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.644065 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.664032 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.683798 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.690492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" event={"ID":"27402239-9191-42d8-89b6-8c0e12e54497","Type":"ContainerStarted","Data":"5d26b493757447bad71f3af1645855dc2ef4031d554f6c8defb38aa84e155fd2"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.691995 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" event={"ID":"3c1453fe-730e-49d9-9d85-efbfec1ca329","Type":"ContainerStarted","Data":"e2e2e1ea687f9294ba50a795fce72bcee9ed632d5eb3a3a75cee69e2e7cf01b2"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.692028 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" event={"ID":"3c1453fe-730e-49d9-9d85-efbfec1ca329","Type":"ContainerStarted","Data":"8ccaee0ec0074f7c7137a012d5b5a56f01aad1f3f5e5f69266d2323616ea189e"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.692241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.693338 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" event={"ID":"d2fcbe2b-49c4-450c-afaa-16668ee4e44a","Type":"ContainerStarted","Data":"83bc4ef0bf63066faee04ec8605a5d339c729a3258b795854517d1cd08a45f9c"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.693477 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" event={"ID":"d2fcbe2b-49c4-450c-afaa-16668ee4e44a","Type":"ContainerStarted","Data":"d88ae38da26d85f74f081b524ba9032448c5adbc71f03e796fc8932d7e31bfcb"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.694458 4762 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gjb9r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.694570 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" podUID="3c1453fe-730e-49d9-9d85-efbfec1ca329" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 
17:49:48.694902 4762 generic.go:334] "Generic (PLEG): container finished" podID="8b52d7ad-b700-4bdb-87bb-94a66d8aaac2" containerID="5067083fa826d0f9c51fe36b1a8a061139f690fe3b0d941cac11b134c654e3e7" exitCode=0 Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.694991 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" event={"ID":"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2","Type":"ContainerDied","Data":"5067083fa826d0f9c51fe36b1a8a061139f690fe3b0d941cac11b134c654e3e7"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.695057 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" event={"ID":"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2","Type":"ContainerStarted","Data":"3aeb8c210005cf8ee644e245e6c4028736da874832cf3e765efde1c9ad1eff25"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.697270 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" event={"ID":"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce","Type":"ContainerStarted","Data":"53a35ab7f66de6b8572565f9321b93045288c8590d4ef842fcb0ad576519eaf2"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.697359 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" event={"ID":"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce","Type":"ContainerStarted","Data":"028478e88fce412c248fafaf5105889871e25671eee4c5f15c8e01aca3a96177"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.698080 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.705802 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.707409 4762 patch_prober.go:28] 
interesting pod/controller-manager-879f6c89f-htp99 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.707517 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" podUID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.726832 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" event={"ID":"273b9986-2821-4038-809b-3ecc7730baca","Type":"ContainerStarted","Data":"9430f976db377096770c9e54233dc0599b1a28d774aefaeafdda271edca9ca73"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.726984 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.727065 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" event={"ID":"273b9986-2821-4038-809b-3ecc7730baca","Type":"ContainerStarted","Data":"26f461a1f8b0d6747158e6f99e192e5426ee407ca75bc1ef26799007df00a792"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.730851 4762 patch_prober.go:28] interesting pod/console-operator-58897d9998-fgqx5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.731118 4762 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" podUID="273b9986-2821-4038-809b-3ecc7730baca" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.731185 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.747960 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.748360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" event={"ID":"f4416087-9030-4283-9d76-ea247185026e","Type":"ContainerStarted","Data":"1284e8259f128bfb236eb7d46a8b0d5dd459d584d760a24d9af4066b23ff6b6f"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.748403 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" event={"ID":"f4416087-9030-4283-9d76-ea247185026e","Type":"ContainerStarted","Data":"9a886863ffaaff8cebc573328839d818e842a92a831aac4d3c018d6268ab923f"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.748424 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" event={"ID":"f4416087-9030-4283-9d76-ea247185026e","Type":"ContainerStarted","Data":"2591b8ed8fe2e258e4597f1d1e6d97c3b7372dac9702d0373cb9b85cd2e08f9f"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.759830 4762 generic.go:334] "Generic (PLEG): container finished" podID="ac005ed9-eab0-4e8a-952d-45e6695640ca" containerID="893a15a8722dbb5d2736dbc23d9f225ac20ee2b9bf42c44127ce86deb956fdd7" exitCode=0 Feb 17 17:49:48 crc 
kubenswrapper[4762]: I0217 17:49:48.759891 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" event={"ID":"ac005ed9-eab0-4e8a-952d-45e6695640ca","Type":"ContainerDied","Data":"893a15a8722dbb5d2736dbc23d9f225ac20ee2b9bf42c44127ce86deb956fdd7"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.759919 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" event={"ID":"ac005ed9-eab0-4e8a-952d-45e6695640ca","Type":"ContainerStarted","Data":"ac622afbbaecd4f69761e1b782fed06a991bdce9ee04e02a9ce58f3bef9316a1"} Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.763495 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.785980 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.804079 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.824391 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.845026 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.850083 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.850933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.864194 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.883461 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.903898 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.923500 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.988576 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwhq\" (UniqueName: \"kubernetes.io/projected/a2ab5d13-17f8-401b-8b7c-cb95a5e3b498-kube-api-access-vkwhq\") pod \"apiserver-7bbb656c7d-gfq6k\" (UID: \"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:48 crc kubenswrapper[4762]: I0217 17:49:48.998320 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3122cdf-f24a-434e-a9f5-49b561090de6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ftjt5\" (UID: \"f3122cdf-f24a-434e-a9f5-49b561090de6\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.017323 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.018425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7ml7\" (UniqueName: \"kubernetes.io/projected/8a02f5c2-7bfb-405b-829e-0b284148e255-kube-api-access-l7ml7\") pod \"kube-storage-version-migrator-operator-b67b599dd-s46wz\" (UID: \"8a02f5c2-7bfb-405b-829e-0b284148e255\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.038025 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.057256 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb4785f9-dceb-48d1-8d9a-3f7c24f08c44-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8fkq\" (UID: \"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.079472 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6n9t\" (UniqueName: \"kubernetes.io/projected/f3d196f2-462b-4413-8cc8-c7c7a1dfa866-kube-api-access-q6n9t\") pod 
\"machine-approver-56656f9798-4gk69\" (UID: \"f3d196f2-462b-4413-8cc8-c7c7a1dfa866\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.102324 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89pb\" (UniqueName: \"kubernetes.io/projected/48ea904c-39ba-449b-bb94-2aa5a0821e9c-kube-api-access-c89pb\") pod \"migrator-59844c95c7-8v48w\" (UID: \"48ea904c-39ba-449b-bb94-2aa5a0821e9c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.119005 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpbph\" (UniqueName: \"kubernetes.io/projected/35fb25d5-f8ca-43c5-ae4d-31da698c4780-kube-api-access-kpbph\") pod \"oauth-openshift-558db77b4-9bp9t\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") " pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.141384 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2dz\" (UniqueName: \"kubernetes.io/projected/c7f82eed-54cf-4b40-b996-e23d502a4f9e-kube-api-access-km2dz\") pod \"console-f9d7485db-zfmsb\" (UID: \"c7f82eed-54cf-4b40-b996-e23d502a4f9e\") " pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.157988 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmjw4\" (UniqueName: \"kubernetes.io/projected/3bcd89b8-e038-4635-b0e3-f4b45607811b-kube-api-access-bmjw4\") pod \"dns-operator-744455d44c-7lmj7\" (UID: \"3bcd89b8-e038-4635-b0e3-f4b45607811b\") " pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.181442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdt9t\" (UniqueName: 
\"kubernetes.io/projected/60395c5c-944a-4aa8-a01d-c8619c2295ad-kube-api-access-vdt9t\") pod \"downloads-7954f5f757-5r5v9\" (UID: \"60395c5c-944a-4aa8-a01d-c8619c2295ad\") " pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.182041 4762 request.go:700] Waited for 1.956190794s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/serviceaccounts/machine-config-operator/token Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.195412 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.197901 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9x92\" (UniqueName: \"kubernetes.io/projected/ddab6d46-4abb-415c-a416-e8131610b68d-kube-api-access-m9x92\") pod \"machine-config-operator-74547568cd-tj4jr\" (UID: \"ddab6d46-4abb-415c-a416-e8131610b68d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.217800 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.219159 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5"] Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.219535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc569\" (UniqueName: \"kubernetes.io/projected/5ab4aadf-cfd3-40b6-b921-2dc992ef8a75-kube-api-access-gc569\") pod \"cluster-image-registry-operator-dc59b4c8b-xfnk7\" (UID: \"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.228046 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.245602 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wxz\" (UniqueName: \"kubernetes.io/projected/86c83b85-567c-43f9-ac88-e332e05bea98-kube-api-access-s7wxz\") pod \"router-default-5444994796-pvxtx\" (UID: \"86c83b85-567c-43f9-ac88-e332e05bea98\") " pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.262192 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwgh\" (UniqueName: \"kubernetes.io/projected/0b012c2c-f737-4c39-99de-e2d747b395d0-kube-api-access-vkwgh\") pod \"openshift-apiserver-operator-796bbdcf4f-n5wpr\" (UID: \"0b012c2c-f737-4c39-99de-e2d747b395d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.265551 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 
17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.274115 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.274183 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.284443 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.285412 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.293444 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.307089 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.308189 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.310438 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.323822 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.325525 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.344737 4762 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.364080 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.369327 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.376047 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.385144 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.386182 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.403821 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.405877 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zfmsb"] Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.425135 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.445751 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.459495 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k"] Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.513269 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vhnw\" (UniqueName: \"kubernetes.io/projected/084cdb6a-4e10-40fd-b651-d628bc556172-kube-api-access-6vhnw\") pod \"multus-admission-controller-857f4d67dd-8p4kj\" (UID: \"084cdb6a-4e10-40fd-b651-d628bc556172\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.522193 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wkkv\" (UniqueName: \"kubernetes.io/projected/5e14e621-40b7-4585-b793-dfd0337aec04-kube-api-access-4wkkv\") pod \"collect-profiles-29522505-kdv7g\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.529862 4762 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9bp9t"] Feb 17 17:49:49 crc kubenswrapper[4762]: W0217 17:49:49.555889 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35fb25d5_f8ca_43c5_ae4d_31da698c4780.slice/crio-cc9851817ed4863190d0e316155f0a7e9041b513e10ef5c15e35ffeab066ea7e WatchSource:0}: Error finding container cc9851817ed4863190d0e316155f0a7e9041b513e10ef5c15e35ffeab066ea7e: Status 404 returned error can't find the container with id cc9851817ed4863190d0e316155f0a7e9041b513e10ef5c15e35ffeab066ea7e Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.562616 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/770c3e14-c910-4422-82c5-d6671f4a91ea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563151 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed085297-7845-4e38-bd40-80bcf2e1ca15-tmpfs\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563200 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbprj\" (UniqueName: \"kubernetes.io/projected/ed085297-7845-4e38-bd40-80bcf2e1ca15-kube-api-access-nbprj\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc 
kubenswrapper[4762]: I0217 17:49:49.563232 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-bound-sa-token\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563264 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15469884-f0fd-4460-97dd-6a428a3e7e0d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563324 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ef7b70e-3331-4d26-b1ea-c18699b6688a-metrics-tls\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563372 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-tls\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563401 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qr2f\" (UniqueName: \"kubernetes.io/projected/fdc035aa-511f-402d-b235-b5fe70abcfd2-kube-api-access-6qr2f\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: 
\"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-service-ca\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563467 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-serving-cert\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563501 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-srv-cert\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-trusted-ca\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563571 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-config\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563612 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlm4\" (UniqueName: \"kubernetes.io/projected/3b6337db-6800-4222-97ac-c9df1a8aeaec-kube-api-access-crlm4\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563718 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15469884-f0fd-4460-97dd-6a428a3e7e0d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563735 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87l2\" (UniqueName: \"kubernetes.io/projected/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-kube-api-access-t87l2\") pod \"etcd-operator-b45778765-2z554\" (UID: 
\"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563759 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-client\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563782 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdc035aa-511f-402d-b235-b5fe70abcfd2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e049ea-d7d8-4a72-8a0f-753a493bc911-serving-cert\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563842 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5czcp\" (UniqueName: \"kubernetes.io/projected/266896ca-532c-45be-b263-727feed4415f-kube-api-access-5czcp\") pod \"control-plane-machine-set-operator-78cbb6b69f-4ttdt\" (UID: \"266896ca-532c-45be-b263-727feed4415f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563857 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp9wd\" (UniqueName: \"kubernetes.io/projected/4d10a1bb-fd22-4e00-9ee5-465663cfa3c8-kube-api-access-xp9wd\") pod \"package-server-manager-789f6589d5-gdlw4\" (UID: \"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563898 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qdrn\" (UniqueName: \"kubernetes.io/projected/7ef7b70e-3331-4d26-b1ea-c18699b6688a-kube-api-access-7qdrn\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563913 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd75cbc2-2e9e-4522-abff-eca6f0f29678-config\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563939 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/266896ca-532c-45be-b263-727feed4415f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4ttdt\" (UID: \"266896ca-532c-45be-b263-727feed4415f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.563984 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7ef7b70e-3331-4d26-b1ea-c18699b6688a-config-volume\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564064 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rf7\" (UniqueName: \"kubernetes.io/projected/c10f8307-650e-49cb-a376-3781d37517b1-kube-api-access-j2rf7\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564084 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd75cbc2-2e9e-4522-abff-eca6f0f29678-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564101 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fpp6\" (UniqueName: \"kubernetes.io/projected/8e22d9a0-7641-44e3-a07f-d07216f7c07c-kube-api-access-2fpp6\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564470 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6q6\" (UniqueName: \"kubernetes.io/projected/770c3e14-c910-4422-82c5-d6671f4a91ea-kube-api-access-wb6q6\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564494 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8e22d9a0-7641-44e3-a07f-d07216f7c07c-signing-key\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rtb8\" (UniqueName: \"kubernetes.io/projected/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-kube-api-access-8rtb8\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564664 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vkfp\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-kube-api-access-6vkfp\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564693 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdc035aa-511f-402d-b235-b5fe70abcfd2-srv-cert\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564730 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564769 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed085297-7845-4e38-bd40-80bcf2e1ca15-webhook-cert\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564786 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8e22d9a0-7641-44e3-a07f-d07216f7c07c-signing-cabundle\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564838 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c10f8307-650e-49cb-a376-3781d37517b1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564874 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e049ea-d7d8-4a72-8a0f-753a493bc911-config\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564915 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b6337db-6800-4222-97ac-c9df1a8aeaec-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564933 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d10a1bb-fd22-4e00-9ee5-465663cfa3c8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gdlw4\" (UID: \"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.564955 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-profile-collector-cert\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565010 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-certificates\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 
17:49:49.565024 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed085297-7845-4e38-bd40-80bcf2e1ca15-apiservice-cert\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565041 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/770c3e14-c910-4422-82c5-d6671f4a91ea-proxy-tls\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565063 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c10f8307-650e-49cb-a376-3781d37517b1-metrics-tls\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565102 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c10f8307-650e-49cb-a376-3781d37517b1-trusted-ca\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565117 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd75cbc2-2e9e-4522-abff-eca6f0f29678-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565181 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts6nb\" (UniqueName: \"kubernetes.io/projected/2d3444be-9dcc-4072-9735-120bfeaa36aa-kube-api-access-ts6nb\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565212 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr4pw\" (UniqueName: \"kubernetes.io/projected/b2e049ea-d7d8-4a72-8a0f-753a493bc911-kube-api-access-jr4pw\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565258 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565286 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6337db-6800-4222-97ac-c9df1a8aeaec-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.565302 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-ca\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.569821 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.069807036 +0000 UTC m=+141.714725036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.587437 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr"] Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667147 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667327 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-bound-sa-token\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.667364 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.167337233 +0000 UTC m=+141.812255243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbprj\" (UniqueName: \"kubernetes.io/projected/ed085297-7845-4e38-bd40-80bcf2e1ca15-kube-api-access-nbprj\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667475 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15469884-f0fd-4460-97dd-6a428a3e7e0d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ef7b70e-3331-4d26-b1ea-c18699b6688a-metrics-tls\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667524 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-tls\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qr2f\" (UniqueName: \"kubernetes.io/projected/fdc035aa-511f-402d-b235-b5fe70abcfd2-kube-api-access-6qr2f\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-service-ca\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667580 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fbf6589-961a-45b8-8b4f-0210b879497c-cert\") pod \"ingress-canary-8cnsj\" (UID: \"3fbf6589-961a-45b8-8b4f-0210b879497c\") 
" pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667595 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-socket-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667614 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-serving-cert\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667649 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667670 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-srv-cert\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667687 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-mountpoint-dir\") pod 
\"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667719 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-trusted-ca\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667737 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-config\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667790 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlm4\" (UniqueName: \"kubernetes.io/projected/3b6337db-6800-4222-97ac-c9df1a8aeaec-kube-api-access-crlm4\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667825 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-csi-data-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667865 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vn884\" (UniqueName: \"kubernetes.io/projected/0d171e82-72d4-4c27-ae71-83e36994e5d8-kube-api-access-vn884\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667906 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15469884-f0fd-4460-97dd-6a428a3e7e0d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667930 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87l2\" (UniqueName: \"kubernetes.io/projected/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-kube-api-access-t87l2\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667959 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-client\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.667996 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdc035aa-511f-402d-b235-b5fe70abcfd2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.668018 
4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e049ea-d7d8-4a72-8a0f-753a493bc911-serving-cert\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.668042 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5czcp\" (UniqueName: \"kubernetes.io/projected/266896ca-532c-45be-b263-727feed4415f-kube-api-access-5czcp\") pod \"control-plane-machine-set-operator-78cbb6b69f-4ttdt\" (UID: \"266896ca-532c-45be-b263-727feed4415f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669073 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp9wd\" (UniqueName: \"kubernetes.io/projected/4d10a1bb-fd22-4e00-9ee5-465663cfa3c8-kube-api-access-xp9wd\") pod \"package-server-manager-789f6589d5-gdlw4\" (UID: \"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669109 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-certs\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669149 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qdrn\" (UniqueName: \"kubernetes.io/projected/7ef7b70e-3331-4d26-b1ea-c18699b6688a-kube-api-access-7qdrn\") pod \"dns-default-fwzcf\" (UID: 
\"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669174 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd75cbc2-2e9e-4522-abff-eca6f0f29678-config\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669204 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/266896ca-532c-45be-b263-727feed4415f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4ttdt\" (UID: \"266896ca-532c-45be-b263-727feed4415f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ef7b70e-3331-4d26-b1ea-c18699b6688a-config-volume\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669253 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rf7\" (UniqueName: \"kubernetes.io/projected/c10f8307-650e-49cb-a376-3781d37517b1-kube-api-access-j2rf7\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669277 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bd75cbc2-2e9e-4522-abff-eca6f0f29678-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669300 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fpp6\" (UniqueName: \"kubernetes.io/projected/8e22d9a0-7641-44e3-a07f-d07216f7c07c-kube-api-access-2fpp6\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669334 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6q6\" (UniqueName: \"kubernetes.io/projected/770c3e14-c910-4422-82c5-d6671f4a91ea-kube-api-access-wb6q6\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669356 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8e22d9a0-7641-44e3-a07f-d07216f7c07c-signing-key\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rtb8\" (UniqueName: \"kubernetes.io/projected/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-kube-api-access-8rtb8\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: 
I0217 17:49:49.669424 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkfp\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-kube-api-access-6vkfp\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669447 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdc035aa-511f-402d-b235-b5fe70abcfd2-srv-cert\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669467 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-node-bootstrap-token\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669512 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-plugins-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: 
\"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669539 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed085297-7845-4e38-bd40-80bcf2e1ca15-webhook-cert\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669560 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8e22d9a0-7641-44e3-a07f-d07216f7c07c-signing-cabundle\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669583 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xf5\" (UniqueName: \"kubernetes.io/projected/3fbf6589-961a-45b8-8b4f-0210b879497c-kube-api-access-27xf5\") pod \"ingress-canary-8cnsj\" (UID: \"3fbf6589-961a-45b8-8b4f-0210b879497c\") " pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669608 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c10f8307-650e-49cb-a376-3781d37517b1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669636 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b6337db-6800-4222-97ac-c9df1a8aeaec-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669686 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e049ea-d7d8-4a72-8a0f-753a493bc911-config\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669716 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d10a1bb-fd22-4e00-9ee5-465663cfa3c8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gdlw4\" (UID: \"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669741 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-profile-collector-cert\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-registration-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 
17:49:49.669791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-certificates\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669813 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed085297-7845-4e38-bd40-80bcf2e1ca15-apiservice-cert\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669836 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/770c3e14-c910-4422-82c5-d6671f4a91ea-proxy-tls\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669883 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c10f8307-650e-49cb-a376-3781d37517b1-metrics-tls\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669924 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c10f8307-650e-49cb-a376-3781d37517b1-trusted-ca\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd75cbc2-2e9e-4522-abff-eca6f0f29678-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts6nb\" (UniqueName: \"kubernetes.io/projected/2d3444be-9dcc-4072-9735-120bfeaa36aa-kube-api-access-ts6nb\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.669996 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4fss\" (UniqueName: \"kubernetes.io/projected/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-kube-api-access-g4fss\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.670037 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr4pw\" (UniqueName: \"kubernetes.io/projected/b2e049ea-d7d8-4a72-8a0f-753a493bc911-kube-api-access-jr4pw\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.670061 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-ca\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.670089 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.670113 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6337db-6800-4222-97ac-c9df1a8aeaec-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.670139 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/770c3e14-c910-4422-82c5-d6671f4a91ea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.670188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed085297-7845-4e38-bd40-80bcf2e1ca15-tmpfs\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 
17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.672168 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15469884-f0fd-4460-97dd-6a428a3e7e0d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.672294 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd75cbc2-2e9e-4522-abff-eca6f0f29678-config\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.672564 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed085297-7845-4e38-bd40-80bcf2e1ca15-tmpfs\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.672919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.673516 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/770c3e14-c910-4422-82c5-d6671f4a91ea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.673761 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.173749495 +0000 UTC m=+141.818667495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.674509 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-trusted-ca\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.674919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-config\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.675323 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ef7b70e-3331-4d26-b1ea-c18699b6688a-config-volume\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " 
pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.675810 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c10f8307-650e-49cb-a376-3781d37517b1-trusted-ca\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.676396 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e049ea-d7d8-4a72-8a0f-753a493bc911-config\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.680515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8e22d9a0-7641-44e3-a07f-d07216f7c07c-signing-cabundle\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.685241 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b6337db-6800-4222-97ac-c9df1a8aeaec-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.685708 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-service-ca\") pod \"etcd-operator-b45778765-2z554\" (UID: 
\"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.687289 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-tls\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.691541 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.692930 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-ca\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.694847 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e049ea-d7d8-4a72-8a0f-753a493bc911-serving-cert\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.695294 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/266896ca-532c-45be-b263-727feed4415f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4ttdt\" (UID: \"266896ca-532c-45be-b263-727feed4415f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 
17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.697726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.698571 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-certificates\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.699481 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed085297-7845-4e38-bd40-80bcf2e1ca15-webhook-cert\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.699812 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6337db-6800-4222-97ac-c9df1a8aeaec-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.700967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.701412 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed085297-7845-4e38-bd40-80bcf2e1ca15-apiservice-cert\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.702474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-serving-cert\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.703155 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/770c3e14-c910-4422-82c5-d6671f4a91ea-proxy-tls\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.703334 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd75cbc2-2e9e-4522-abff-eca6f0f29678-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.710378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7ef7b70e-3331-4d26-b1ea-c18699b6688a-metrics-tls\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.710712 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-etcd-client\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.720988 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdc035aa-511f-402d-b235-b5fe70abcfd2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.728100 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8e22d9a0-7641-44e3-a07f-d07216f7c07c-signing-key\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.728448 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-srv-cert\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.737624 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c10f8307-650e-49cb-a376-3781d37517b1-metrics-tls\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.742545 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-bound-sa-token\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.744312 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d10a1bb-fd22-4e00-9ee5-465663cfa3c8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gdlw4\" (UID: \"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.744506 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15469884-f0fd-4460-97dd-6a428a3e7e0d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.750192 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdc035aa-511f-402d-b235-b5fe70abcfd2-srv-cert\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.771692 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.771993 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.271975183 +0000 UTC m=+141.916893193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.772311 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-registration-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.772638 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-registration-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: 
I0217 17:49:49.772815 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4fss\" (UniqueName: \"kubernetes.io/projected/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-kube-api-access-g4fss\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.772898 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.773009 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fbf6589-961a-45b8-8b4f-0210b879497c-cert\") pod \"ingress-canary-8cnsj\" (UID: \"3fbf6589-961a-45b8-8b4f-0210b879497c\") " pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.773366 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.273358964 +0000 UTC m=+141.918276974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774123 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-socket-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774248 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-mountpoint-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-csi-data-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774422 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn884\" (UniqueName: \"kubernetes.io/projected/0d171e82-72d4-4c27-ae71-83e36994e5d8-kube-api-access-vn884\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " 
pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-certs\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-node-bootstrap-token\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774870 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-plugins-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.774942 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27xf5\" (UniqueName: \"kubernetes.io/projected/3fbf6589-961a-45b8-8b4f-0210b879497c-kube-api-access-27xf5\") pod \"ingress-canary-8cnsj\" (UID: \"3fbf6589-961a-45b8-8b4f-0210b879497c\") " pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.775331 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-mountpoint-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " 
pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.775467 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-csi-data-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.775565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-plugins-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.779854 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbprj\" (UniqueName: \"kubernetes.io/projected/ed085297-7845-4e38-bd40-80bcf2e1ca15-kube-api-access-nbprj\") pod \"packageserver-d55dfcdfc-d6szx\" (UID: \"ed085297-7845-4e38-bd40-80bcf2e1ca15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.780484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d171e82-72d4-4c27-ae71-83e36994e5d8-socket-dir\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.781397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fpp6\" (UniqueName: \"kubernetes.io/projected/8e22d9a0-7641-44e3-a07f-d07216f7c07c-kube-api-access-2fpp6\") pod \"service-ca-9c57cc56f-l8z85\" (UID: \"8e22d9a0-7641-44e3-a07f-d07216f7c07c\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.793129 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.800796 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.801278 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" event={"ID":"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2","Type":"ContainerStarted","Data":"9bba14d2923d8a9f389f561d083d7d78b9b92d841028074e32f0fc47c9e50da3"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.802556 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" event={"ID":"8b52d7ad-b700-4bdb-87bb-94a66d8aaac2","Type":"ContainerStarted","Data":"7dbd11723b4705bf7a2faf6852e6def66aedf1cfb44d886ffe7a747f5bfccb27"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.809893 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.811634 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6q6\" (UniqueName: \"kubernetes.io/projected/770c3e14-c910-4422-82c5-d6671f4a91ea-kube-api-access-wb6q6\") pod \"machine-config-controller-84d6567774-2zhrk\" (UID: \"770c3e14-c910-4422-82c5-d6671f4a91ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.814540 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qdrn\" (UniqueName: \"kubernetes.io/projected/7ef7b70e-3331-4d26-b1ea-c18699b6688a-kube-api-access-7qdrn\") pod \"dns-default-fwzcf\" (UID: \"7ef7b70e-3331-4d26-b1ea-c18699b6688a\") " pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.814678 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fbf6589-961a-45b8-8b4f-0210b879497c-cert\") pod \"ingress-canary-8cnsj\" (UID: \"3fbf6589-961a-45b8-8b4f-0210b879497c\") " pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.817187 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.833157 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-node-bootstrap-token\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.833176 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rtb8\" (UniqueName: \"kubernetes.io/projected/a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb-kube-api-access-8rtb8\") pod \"catalog-operator-68c6474976-nbm9w\" (UID: \"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.833671 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-certs\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.836608 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5r5v9"] Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.839066 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlm4\" (UniqueName: \"kubernetes.io/projected/3b6337db-6800-4222-97ac-c9df1a8aeaec-kube-api-access-crlm4\") pod \"openshift-controller-manager-operator-756b6f6bc6-l9p7g\" (UID: \"3b6337db-6800-4222-97ac-c9df1a8aeaec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.857406 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vkfp\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-kube-api-access-6vkfp\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.857654 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" event={"ID":"35fb25d5-f8ca-43c5-ae4d-31da698c4780","Type":"ContainerStarted","Data":"cc9851817ed4863190d0e316155f0a7e9041b513e10ef5c15e35ffeab066ea7e"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.868528 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd75cbc2-2e9e-4522-abff-eca6f0f29678-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g7zcx\" (UID: \"bd75cbc2-2e9e-4522-abff-eca6f0f29678\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.881100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zfmsb" event={"ID":"c7f82eed-54cf-4b40-b996-e23d502a4f9e","Type":"ContainerStarted","Data":"74d6d51c5047bc72274afe223a6187bf0624268ab0fbf0c4e46434a64dbe5dc0"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.881354 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.882756 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.382729336 +0000 UTC m=+142.027647346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.896109 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" event={"ID":"f3122cdf-f24a-434e-a9f5-49b561090de6","Type":"ContainerStarted","Data":"b9f97f5ff762ebae7003590d19209f5230f4083eb3577c085a34023497852a1b"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.904419 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" event={"ID":"27402239-9191-42d8-89b6-8c0e12e54497","Type":"ContainerStarted","Data":"a2c0c9d6eae076d84673ae3c2568feb260f20d0559a67bbde29e35a5a46a0b22"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.904470 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" event={"ID":"27402239-9191-42d8-89b6-8c0e12e54497","Type":"ContainerStarted","Data":"cc139c5da2046554ef6e5adabcb1cafb3a1f81ba4682657c715e12b937cb05e6"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.922281 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87l2\" (UniqueName: 
\"kubernetes.io/projected/49bda643-ddd8-4dd8-854a-7ec3d0f960ea-kube-api-access-t87l2\") pod \"etcd-operator-b45778765-2z554\" (UID: \"49bda643-ddd8-4dd8-854a-7ec3d0f960ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.931169 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr4pw\" (UniqueName: \"kubernetes.io/projected/b2e049ea-d7d8-4a72-8a0f-753a493bc911-kube-api-access-jr4pw\") pod \"service-ca-operator-777779d784-nwmtg\" (UID: \"b2e049ea-d7d8-4a72-8a0f-753a493bc911\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.938172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" event={"ID":"ac005ed9-eab0-4e8a-952d-45e6695640ca","Type":"ContainerStarted","Data":"4480d75f57be2eacb8ee58ae1ad6a7692943b7abfd194bac7f3c3f5fb1277bec"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.938871 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.944046 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" event={"ID":"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498","Type":"ContainerStarted","Data":"327f02b30af29fe3919c1d9cce69c3877504c778ebe501cda3f73cb3a6cfb911"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.957454 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5czcp\" (UniqueName: \"kubernetes.io/projected/266896ca-532c-45be-b263-727feed4415f-kube-api-access-5czcp\") pod \"control-plane-machine-set-operator-78cbb6b69f-4ttdt\" (UID: \"266896ca-532c-45be-b263-727feed4415f\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.958714 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" event={"ID":"f3d196f2-462b-4413-8cc8-c7c7a1dfa866","Type":"ContainerStarted","Data":"b22c7825dd3363c086f5492d3c4fe59872ea5637d59b00411499d593dc5ed7b3"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.980785 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" event={"ID":"0b012c2c-f737-4c39-99de-e2d747b395d0","Type":"ContainerStarted","Data":"cdf4bec07095dfaeccc30dc6b0f86a9d8c82914f5e20f77ef5dfc028799d80c1"} Feb 17 17:49:49 crc kubenswrapper[4762]: I0217 17:49:49.983841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:49 crc kubenswrapper[4762]: E0217 17:49:49.986467 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.486451908 +0000 UTC m=+142.131369918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.000227 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp9wd\" (UniqueName: \"kubernetes.io/projected/4d10a1bb-fd22-4e00-9ee5-465663cfa3c8-kube-api-access-xp9wd\") pod \"package-server-manager-789f6589d5-gdlw4\" (UID: \"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.003162 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.009177 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.009214 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.016348 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.022723 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rf7\" (UniqueName: \"kubernetes.io/projected/c10f8307-650e-49cb-a376-3781d37517b1-kube-api-access-j2rf7\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.028263 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.036922 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.038370 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c10f8307-650e-49cb-a376-3781d37517b1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6x8mb\" (UID: \"c10f8307-650e-49cb-a376-3781d37517b1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.039944 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qr2f\" (UniqueName: \"kubernetes.io/projected/fdc035aa-511f-402d-b235-b5fe70abcfd2-kube-api-access-6qr2f\") pod \"olm-operator-6b444d44fb-q2ktl\" (UID: \"fdc035aa-511f-402d-b235-b5fe70abcfd2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.042544 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.070103 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.070608 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.083039 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.088589 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.090580 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.590554292 +0000 UTC m=+142.235472302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.106701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts6nb\" (UniqueName: \"kubernetes.io/projected/2d3444be-9dcc-4072-9735-120bfeaa36aa-kube-api-access-ts6nb\") pod \"marketplace-operator-79b997595-2gktn\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.115246 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4fss\" (UniqueName: \"kubernetes.io/projected/1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7-kube-api-access-g4fss\") pod \"machine-config-server-8dtsm\" (UID: \"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7\") " pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.117251 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn884\" (UniqueName: \"kubernetes.io/projected/0d171e82-72d4-4c27-ae71-83e36994e5d8-kube-api-access-vn884\") pod \"csi-hostpathplugin-l7mfh\" (UID: \"0d171e82-72d4-4c27-ae71-83e36994e5d8\") " pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.123051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27xf5\" (UniqueName: \"kubernetes.io/projected/3fbf6589-961a-45b8-8b4f-0210b879497c-kube-api-access-27xf5\") pod \"ingress-canary-8cnsj\" (UID: 
\"3fbf6589-961a-45b8-8b4f-0210b879497c\") " pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.127093 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8cnsj" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.171938 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.178238 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8dtsm" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.190508 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.191081 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.691068468 +0000 UTC m=+142.335986478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.236468 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.293244 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.293627 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.793607735 +0000 UTC m=+142.438525745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.324910 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.375957 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.404176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.405070 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:50.905054749 +0000 UTC m=+142.549972769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.508414 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.508883 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.008863783 +0000 UTC m=+142.653781803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.616500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.617025 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.117013458 +0000 UTC m=+142.761931468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.682979 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.689148 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-smpx4" podStartSLOduration=117.689128765 podStartE2EDuration="1m57.689128765s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:50.663123937 +0000 UTC m=+142.308041957" watchObservedRunningTime="2026-02-17 17:49:50.689128765 +0000 UTC m=+142.334046775" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.690877 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w"] Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.719212 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.719540 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.219525094 +0000 UTC m=+142.864443104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.729065 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg"] Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.751781 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq"] Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.786209 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-clnl8" podStartSLOduration=118.786190658 podStartE2EDuration="1m58.786190658s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:50.785667103 +0000 UTC m=+142.430585113" watchObservedRunningTime="2026-02-17 17:49:50.786190658 +0000 UTC m=+142.431108668" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.820840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.821321 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.321287778 +0000 UTC m=+142.966205788 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.831207 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz"] Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.875111 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7"] Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.899104 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" podStartSLOduration=117.899083295 podStartE2EDuration="1m57.899083295s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:50.898268361 +0000 
UTC m=+142.543186371" watchObservedRunningTime="2026-02-17 17:49:50.899083295 +0000 UTC m=+142.544001305" Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.909530 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7lmj7"] Feb 17 17:49:50 crc kubenswrapper[4762]: W0217 17:49:50.910759 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e049ea_d7d8_4a72_8a0f_753a493bc911.slice/crio-fbeb3d87c4c5c9d9ec0f443a324f71aca3c77165b8d2a07cab9a308dc2970dbe WatchSource:0}: Error finding container fbeb3d87c4c5c9d9ec0f443a324f71aca3c77165b8d2a07cab9a308dc2970dbe: Status 404 returned error can't find the container with id fbeb3d87c4c5c9d9ec0f443a324f71aca3c77165b8d2a07cab9a308dc2970dbe Feb 17 17:49:50 crc kubenswrapper[4762]: W0217 17:49:50.910966 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4785f9_dceb_48d1_8d9a_3f7c24f08c44.slice/crio-d979959799540148b828d84f5c5cd3dd29c83f41513a0985c3e34e3c8ae4fd96 WatchSource:0}: Error finding container d979959799540148b828d84f5c5cd3dd29c83f41513a0985c3e34e3c8ae4fd96: Status 404 returned error can't find the container with id d979959799540148b828d84f5c5cd3dd29c83f41513a0985c3e34e3c8ae4fd96 Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.924338 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:50 crc kubenswrapper[4762]: E0217 17:49:50.924770 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.424750133 +0000 UTC m=+143.069668143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.991699 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" event={"ID":"f3122cdf-f24a-434e-a9f5-49b561090de6","Type":"ContainerStarted","Data":"266c5084628e62010807ddf03a946cc42d3709e1955e4968fcb8546bc94d5fcb"} Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.993617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" event={"ID":"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44","Type":"ContainerStarted","Data":"d979959799540148b828d84f5c5cd3dd29c83f41513a0985c3e34e3c8ae4fd96"} Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.994737 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8dtsm" event={"ID":"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7","Type":"ContainerStarted","Data":"b66e772e2a723972b4ee054ff3b66039b7f5cca3f6657ed426cf668ca605feeb"} Feb 17 17:49:50 crc kubenswrapper[4762]: I0217 17:49:50.998100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zfmsb" 
event={"ID":"c7f82eed-54cf-4b40-b996-e23d502a4f9e","Type":"ContainerStarted","Data":"545b05ab8cffd8d90be63337ed8c2771ff3cb73966917a5f00f8cff0fe506680"} Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.000011 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" event={"ID":"48ea904c-39ba-449b-bb94-2aa5a0821e9c","Type":"ContainerStarted","Data":"f7d2d16abf790614df06026753781683fbc915f6cb6788cb0f69ae3f6dfee338"} Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.001868 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" event={"ID":"b2e049ea-d7d8-4a72-8a0f-753a493bc911","Type":"ContainerStarted","Data":"fbeb3d87c4c5c9d9ec0f443a324f71aca3c77165b8d2a07cab9a308dc2970dbe"} Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.003826 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pvxtx" event={"ID":"86c83b85-567c-43f9-ac88-e332e05bea98","Type":"ContainerStarted","Data":"6b3e11dd541d7840c400940c69f2493a3838eca3f0130f4dfcdfb7b636bf9554"} Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.004889 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5r5v9" event={"ID":"60395c5c-944a-4aa8-a01d-c8619c2295ad","Type":"ContainerStarted","Data":"4516cfce32d192b9b1e83ed7982f93d42f8c012892bd6796df7400a3dc933581"} Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.007526 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" event={"ID":"0b012c2c-f737-4c39-99de-e2d747b395d0","Type":"ContainerStarted","Data":"1b38dabda77bf3cb21a4c8990c6134690fe29279e2938bad26dfdb68f750f6b0"} Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.026089 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.026510 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.526497266 +0000 UTC m=+143.171415286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.128626 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.130554 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.630426364 +0000 UTC m=+143.275344374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.132307 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4g9gr" podStartSLOduration=119.13229131 podStartE2EDuration="1m59.13229131s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.130785975 +0000 UTC m=+142.775703985" watchObservedRunningTime="2026-02-17 17:49:51.13229131 +0000 UTC m=+142.777209320" Feb 17 17:49:51 crc kubenswrapper[4762]: W0217 17:49:51.156790 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a02f5c2_7bfb_405b_829e_0b284148e255.slice/crio-eef253d4fccff97a86007aadea21bb96c2b5bdc1f454605e7664a320f657ef3b WatchSource:0}: Error finding container eef253d4fccff97a86007aadea21bb96c2b5bdc1f454605e7664a320f657ef3b: Status 404 returned error can't find the container with id eef253d4fccff97a86007aadea21bb96c2b5bdc1f454605e7664a320f657ef3b Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.230909 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.232693 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.732638642 +0000 UTC m=+143.377556652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: W0217 17:49:51.232775 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab4aadf_cfd3_40b6_b921_2dc992ef8a75.slice/crio-691d7f93d409e36127d5126400560aaaf4625e1dfedee7acbb7cbfed1e86e81f WatchSource:0}: Error finding container 691d7f93d409e36127d5126400560aaaf4625e1dfedee7acbb7cbfed1e86e81f: Status 404 returned error can't find the container with id 691d7f93d409e36127d5126400560aaaf4625e1dfedee7acbb7cbfed1e86e81f Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.331936 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.332676 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.832641473 +0000 UTC m=+143.477559483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: W0217 17:49:51.332895 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bcd89b8_e038_4635_b0e3_f4b45607811b.slice/crio-7cf44bb47d46df9ea553e7105f9d0604712d401121e0df9a00b72238c1f1c71d WatchSource:0}: Error finding container 7cf44bb47d46df9ea553e7105f9d0604712d401121e0df9a00b72238c1f1c71d: Status 404 returned error can't find the container with id 7cf44bb47d46df9ea553e7105f9d0604712d401121e0df9a00b72238c1f1c71d Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.371292 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" podStartSLOduration=118.371273208 podStartE2EDuration="1m58.371273208s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.367103573 +0000 UTC m=+143.012021593" watchObservedRunningTime="2026-02-17 17:49:51.371273208 +0000 UTC m=+143.016191218" Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.433313 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.433687 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:51.933675005 +0000 UTC m=+143.578593015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.524295 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" podStartSLOduration=118.524274494 podStartE2EDuration="1m58.524274494s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.523051478 +0000 UTC m=+143.167969498" watchObservedRunningTime="2026-02-17 17:49:51.524274494 +0000 UTC m=+143.169192504" Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.527078 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4bfv5" Feb 17 17:49:51 crc kubenswrapper[4762]: 
I0217 17:49:51.534953 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.535299 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.035281944 +0000 UTC m=+143.680199954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.639839 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fgqx5" podStartSLOduration=118.63982174 podStartE2EDuration="1m58.63982174s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.639503531 +0000 UTC m=+143.284421541" watchObservedRunningTime="2026-02-17 17:49:51.63982174 +0000 UTC m=+143.284739750" Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.640011 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.640300 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.140279804 +0000 UTC m=+143.785197824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.749718 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.750135 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.250116558 +0000 UTC m=+143.895034568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.828980 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" podStartSLOduration=119.828962887 podStartE2EDuration="1m59.828962887s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.828798222 +0000 UTC m=+143.473716232" watchObservedRunningTime="2026-02-17 17:49:51.828962887 +0000 UTC m=+143.473880897" Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.851344 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.851735 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.351721747 +0000 UTC m=+143.996639757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.917732 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ftjt5" podStartSLOduration=118.917710461 podStartE2EDuration="1m58.917710461s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.915690811 +0000 UTC m=+143.560608821" watchObservedRunningTime="2026-02-17 17:49:51.917710461 +0000 UTC m=+143.562628481" Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.952874 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.953049 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.453024577 +0000 UTC m=+144.097942587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.953196 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:51 crc kubenswrapper[4762]: E0217 17:49:51.953655 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.453615865 +0000 UTC m=+144.098533875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:51 crc kubenswrapper[4762]: I0217 17:49:51.957732 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n5wpr" podStartSLOduration=119.957706927 podStartE2EDuration="1m59.957706927s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:51.954021407 +0000 UTC m=+143.598939417" watchObservedRunningTime="2026-02-17 17:49:51.957706927 +0000 UTC m=+143.602624947" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.014346 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" event={"ID":"48ea904c-39ba-449b-bb94-2aa5a0821e9c","Type":"ContainerStarted","Data":"a1dcd93646222350c2753b92f53c157667a746086d87963f1088073ea9456e24"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.017324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" event={"ID":"8a02f5c2-7bfb-405b-829e-0b284148e255","Type":"ContainerStarted","Data":"99169679c7ebceb4f46e71761d090941948a1205b329bb5f910cbc9d7b3e3dd4"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.017393 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" event={"ID":"8a02f5c2-7bfb-405b-829e-0b284148e255","Type":"ContainerStarted","Data":"eef253d4fccff97a86007aadea21bb96c2b5bdc1f454605e7664a320f657ef3b"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.018832 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5r5v9" event={"ID":"60395c5c-944a-4aa8-a01d-c8619c2295ad","Type":"ContainerStarted","Data":"93fc6734d9ac147bf60c63503cef797470d7f0872c85f4745bd0004addabc21d"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.019104 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.024697 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" event={"ID":"3bcd89b8-e038-4635-b0e3-f4b45607811b","Type":"ContainerStarted","Data":"7cf44bb47d46df9ea553e7105f9d0604712d401121e0df9a00b72238c1f1c71d"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.038284 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" event={"ID":"35fb25d5-f8ca-43c5-ae4d-31da698c4780","Type":"ContainerStarted","Data":"7ff0c4858cc2ca577111d16ac1ffb9274d3d7c6b641ecd3b631382229e2e109b"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.040659 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-5r5v9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.040694 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5r5v9" podUID="60395c5c-944a-4aa8-a01d-c8619c2295ad" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.041088 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.043604 4762 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9bp9t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.043679 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.045298 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" event={"ID":"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75","Type":"ContainerStarted","Data":"691d7f93d409e36127d5126400560aaaf4625e1dfedee7acbb7cbfed1e86e81f"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.048356 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zfmsb" podStartSLOduration=119.048332438 podStartE2EDuration="1m59.048332438s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.04538574 +0000 UTC m=+143.690303750" watchObservedRunningTime="2026-02-17 17:49:52.048332438 +0000 UTC 
m=+143.693250448" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.052945 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pvxtx" event={"ID":"86c83b85-567c-43f9-ac88-e332e05bea98","Type":"ContainerStarted","Data":"d671e286c6ff07193472e9b29cdcaacecce56bee79098bd5161c2ebe08e9df17"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.053975 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.054029 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.554014218 +0000 UTC m=+144.198932228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.055652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.056075 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.556063189 +0000 UTC m=+144.200981209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.067786 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" event={"ID":"b2e049ea-d7d8-4a72-8a0f-753a493bc911","Type":"ContainerStarted","Data":"3293f374038f37b6e1497dbf5435ecf8b403356909afcb22fea02586b6044322"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.079353 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8dtsm" event={"ID":"1cbb45ba-dd12-4ab8-a47a-c5902c96dbf7","Type":"ContainerStarted","Data":"330eb6af8d7664065f87509c0eb1d6aa50b707af3ba080f1452495cb0d3facc9"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.087616 4762 generic.go:334] "Generic (PLEG): container finished" podID="a2ab5d13-17f8-401b-8b7c-cb95a5e3b498" containerID="c0ac1134297a02b250bffe1979276a975db7deaec40b67e598664932bdefa5ec" exitCode=0 Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.088679 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" event={"ID":"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498","Type":"ContainerDied","Data":"c0ac1134297a02b250bffe1979276a975db7deaec40b67e598664932bdefa5ec"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.110859 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" 
event={"ID":"f3d196f2-462b-4413-8cc8-c7c7a1dfa866","Type":"ContainerStarted","Data":"399cfe405794c0df6ee19a4124f34738326c471790959ced6f74cd5107bcc93f"} Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.159881 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.161430 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.66141307 +0000 UTC m=+144.306331080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.261569 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.282832 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.782815481 +0000 UTC m=+144.427733481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.333174 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-l8z85"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.367140 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.367523 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.867506925 +0000 UTC m=+144.512424935 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.377450 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.399525 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s46wz" podStartSLOduration=119.397243654 podStartE2EDuration="1m59.397243654s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.396055599 +0000 UTC m=+144.040973629" watchObservedRunningTime="2026-02-17 17:49:52.397243654 +0000 UTC m=+144.042161674" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.399754 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fwzcf"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.399796 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:52 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:52 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:52 crc kubenswrapper[4762]: healthz check failed Feb 17 17:49:52 crc kubenswrapper[4762]: 
I0217 17:49:52.399842 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.429661 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.468150 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.468457 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:52.968446154 +0000 UTC m=+144.613364164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.484312 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8dtsm" podStartSLOduration=5.484295988 podStartE2EDuration="5.484295988s" podCreationTimestamp="2026-02-17 17:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.443966832 +0000 UTC m=+144.088884842" watchObservedRunningTime="2026-02-17 17:49:52.484295988 +0000 UTC m=+144.129213998" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.486746 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.486930 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.510734 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nwmtg" podStartSLOduration=119.510713128 podStartE2EDuration="1m59.510713128s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.499345528 +0000 UTC m=+144.144263538" 
watchObservedRunningTime="2026-02-17 17:49:52.510713128 +0000 UTC m=+144.155631138" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.512158 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.520770 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.520836 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.522447 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk"] Feb 17 17:49:52 crc kubenswrapper[4762]: W0217 17:49:52.545637 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded085297_7845_4e38_bd40_80bcf2e1ca15.slice/crio-4dcf19849c823646914d7b6ab65ea1b63eaa5fddc180c04bed9d25ddac298160 WatchSource:0}: Error finding container 4dcf19849c823646914d7b6ab65ea1b63eaa5fddc180c04bed9d25ddac298160: Status 404 returned error can't find the container with id 4dcf19849c823646914d7b6ab65ea1b63eaa5fddc180c04bed9d25ddac298160 Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.554289 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pvxtx" podStartSLOduration=119.554271071 podStartE2EDuration="1m59.554271071s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.529608413 +0000 UTC m=+144.174526423" watchObservedRunningTime="2026-02-17 17:49:52.554271071 +0000 UTC m=+144.199189081" Feb 
17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.557242 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" podStartSLOduration=120.557230729 podStartE2EDuration="2m0.557230729s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.555145747 +0000 UTC m=+144.200063757" watchObservedRunningTime="2026-02-17 17:49:52.557230729 +0000 UTC m=+144.202148739" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.569380 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.569758 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.069727183 +0000 UTC m=+144.714645193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.573631 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5r5v9" podStartSLOduration=119.573609309 podStartE2EDuration="1m59.573609309s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:52.573047402 +0000 UTC m=+144.217965412" watchObservedRunningTime="2026-02-17 17:49:52.573609309 +0000 UTC m=+144.218527319" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.634612 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w"] Feb 17 17:49:52 crc kubenswrapper[4762]: W0217 17:49:52.638672 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod266896ca_532c_45be_b263_727feed4415f.slice/crio-67ed2aa03cd521a2d284852ed807265f40479c114a66fd1028d8defe721f4662 WatchSource:0}: Error finding container 67ed2aa03cd521a2d284852ed807265f40479c114a66fd1028d8defe721f4662: Status 404 returned error can't find the container with id 67ed2aa03cd521a2d284852ed807265f40479c114a66fd1028d8defe721f4662 Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.647425 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx"] Feb 17 17:49:52 crc 
kubenswrapper[4762]: I0217 17:49:52.664950 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l7mfh"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.672302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.672611 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.17260021 +0000 UTC m=+144.817518220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.674237 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2z554"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.679924 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.704930 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8cnsj"] Feb 17 17:49:52 crc 
kubenswrapper[4762]: I0217 17:49:52.709245 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8p4kj"] Feb 17 17:49:52 crc kubenswrapper[4762]: W0217 17:49:52.731961 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd75cbc2_2e9e_4522_abff_eca6f0f29678.slice/crio-c7d2230a731ca541399a062580e133cd77948979cca6a5ff2aa47b5ce1503799 WatchSource:0}: Error finding container c7d2230a731ca541399a062580e133cd77948979cca6a5ff2aa47b5ce1503799: Status 404 returned error can't find the container with id c7d2230a731ca541399a062580e133cd77948979cca6a5ff2aa47b5ce1503799 Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.759013 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb"] Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.779783 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.779918 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.279892559 +0000 UTC m=+144.924810569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.780016 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.780386 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.280374704 +0000 UTC m=+144.925292714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.798483 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gktn"] Feb 17 17:49:52 crc kubenswrapper[4762]: W0217 17:49:52.809167 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbf6589_961a_45b8_8b4f_0210b879497c.slice/crio-3be03bb1bd40e585b5c08f80ba58408bb64b1d7627c3978ba1891c71380f3137 WatchSource:0}: Error finding container 3be03bb1bd40e585b5c08f80ba58408bb64b1d7627c3978ba1891c71380f3137: Status 404 returned error can't find the container with id 3be03bb1bd40e585b5c08f80ba58408bb64b1d7627c3978ba1891c71380f3137 Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.882092 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.882396 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.382378954 +0000 UTC m=+145.027296964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.965277 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.967176 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:52 crc kubenswrapper[4762]: I0217 17:49:52.983810 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:52 crc kubenswrapper[4762]: E0217 17:49:52.984227 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.48421584 +0000 UTC m=+145.129133840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.089093 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.090749 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.590727136 +0000 UTC m=+145.235645146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.171604 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" event={"ID":"ed085297-7845-4e38-bd40-80bcf2e1ca15","Type":"ContainerStarted","Data":"4dcf19849c823646914d7b6ab65ea1b63eaa5fddc180c04bed9d25ddac298160"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.178446 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" event={"ID":"c10f8307-650e-49cb-a376-3781d37517b1","Type":"ContainerStarted","Data":"80dfe65bfe851309f54ac2b659be47b3ffbadafb9831c0635019e1e2ba8c0a90"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.192404 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" event={"ID":"266896ca-532c-45be-b263-727feed4415f","Type":"ContainerStarted","Data":"67ed2aa03cd521a2d284852ed807265f40479c114a66fd1028d8defe721f4662"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.194480 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" event={"ID":"5e14e621-40b7-4585-b793-dfd0337aec04","Type":"ContainerStarted","Data":"4bacaf123313e0901db1034a2426dfb71fb078c50a6ef2e66ac69d9e466fb118"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.195682 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8cnsj" 
event={"ID":"3fbf6589-961a-45b8-8b4f-0210b879497c","Type":"ContainerStarted","Data":"3be03bb1bd40e585b5c08f80ba58408bb64b1d7627c3978ba1891c71380f3137"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.198359 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.198770 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.698759267 +0000 UTC m=+145.343677277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.221271 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" event={"ID":"ddab6d46-4abb-415c-a416-e8131610b68d","Type":"ContainerStarted","Data":"75215843a3a9262edf8a4815d295c298596d4307c9c99c3ceea0d92acd20fd58"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.221354 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" 
event={"ID":"ddab6d46-4abb-415c-a416-e8131610b68d","Type":"ContainerStarted","Data":"66f5a1ca7bce164ffa201315b4d7a1565cecb9008ae7f82518d36b18c9257ffd"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.225085 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" event={"ID":"49bda643-ddd8-4dd8-854a-7ec3d0f960ea","Type":"ContainerStarted","Data":"4b73d2acde65d6a8a10336308639ad56283d5a29cc85b2b284a0f7fd36da4b1b"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.231072 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" event={"ID":"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb","Type":"ContainerStarted","Data":"f270953939648db20442a817ec041611c12d3c84d0a07780112b8667b9dd579b"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.248485 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" event={"ID":"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8","Type":"ContainerStarted","Data":"8f883d70f829b819b9a105418663115155ae3ea9d88bd59b21760d449dac7e2a"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.250908 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" event={"ID":"3b6337db-6800-4222-97ac-c9df1a8aeaec","Type":"ContainerStarted","Data":"c15d2f6dd017e8dd5eb87ea5271a0d9204a8f8361e17281567531ff87125be9a"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.254054 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" event={"ID":"770c3e14-c910-4422-82c5-d6671f4a91ea","Type":"ContainerStarted","Data":"16dd080eac4048a4ff310d7add98b409d2c8588c427766fc5f06bb21c9b710db"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.255826 4762 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" event={"ID":"2d3444be-9dcc-4072-9735-120bfeaa36aa","Type":"ContainerStarted","Data":"6e71008c70a4a148e11f669194a4876c1e37a9077f104ae176f93ed92f9b2ec4"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.259735 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" event={"ID":"a2ab5d13-17f8-401b-8b7c-cb95a5e3b498","Type":"ContainerStarted","Data":"4e3ccf792e3927a87a1ab98d677b368890b298957a8f009b2e43ca3c8695a787"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.261139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" event={"ID":"0d171e82-72d4-4c27-ae71-83e36994e5d8","Type":"ContainerStarted","Data":"8270c87b9b98194f437fb9525831a3ea8a9f4a95c2fd55a433d756ff69221129"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.276499 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fwzcf" event={"ID":"7ef7b70e-3331-4d26-b1ea-c18699b6688a","Type":"ContainerStarted","Data":"df3babc8def1150f4f7500d7a3d813fa331caa3b311eaff82f7ea2a4344415dc"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.287160 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" event={"ID":"bd75cbc2-2e9e-4522-abff-eca6f0f29678","Type":"ContainerStarted","Data":"c7d2230a731ca541399a062580e133cd77948979cca6a5ff2aa47b5ce1503799"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.288587 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" podStartSLOduration=120.288573614 podStartE2EDuration="2m0.288573614s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 17:49:53.286976506 +0000 UTC m=+144.931894516" watchObservedRunningTime="2026-02-17 17:49:53.288573614 +0000 UTC m=+144.933491624" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.299012 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.299185 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.79915882 +0000 UTC m=+145.444076830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.299310 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.299575 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.799564793 +0000 UTC m=+145.444482803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.303844 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" event={"ID":"48ea904c-39ba-449b-bb94-2aa5a0821e9c","Type":"ContainerStarted","Data":"12e758b17f2375b8d1b277fda51ba318062c05a8939246a27849a66459eaa6c4"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.310013 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" event={"ID":"8e22d9a0-7641-44e3-a07f-d07216f7c07c","Type":"ContainerStarted","Data":"d679c49d3562dac4ff1698ec214a4a8795a20395e7c43725770887752b8d4ca8"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.310065 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" event={"ID":"8e22d9a0-7641-44e3-a07f-d07216f7c07c","Type":"ContainerStarted","Data":"02f2765aef905f5b594aec426400d1060033f618571e3cf270463fda191d2e66"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.321266 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8v48w" podStartSLOduration=120.321248751 podStartE2EDuration="2m0.321248751s" 
podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:53.31919172 +0000 UTC m=+144.964109750" watchObservedRunningTime="2026-02-17 17:49:53.321248751 +0000 UTC m=+144.966166761" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.332354 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" event={"ID":"5ab4aadf-cfd3-40b6-b921-2dc992ef8a75","Type":"ContainerStarted","Data":"8a646ba56d9706999cfef9cbf9e9996bd0bcdb06ccc0529ccb37ddd065ae5106"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.337456 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-l8z85" podStartSLOduration=120.337444315 podStartE2EDuration="2m0.337444315s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:53.336148867 +0000 UTC m=+144.981066877" watchObservedRunningTime="2026-02-17 17:49:53.337444315 +0000 UTC m=+144.982362325" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.348856 4762 patch_prober.go:28] interesting pod/apiserver-76f77b778f-nsnbr container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]log ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]etcd ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Feb 17 17:49:53 crc kubenswrapper[4762]: 
[+]poststarthook/storage-object-count-tracker-hook ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 17 17:49:53 crc kubenswrapper[4762]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 17 17:49:53 crc kubenswrapper[4762]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 17 17:49:53 crc kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 17:49:53 crc kubenswrapper[4762]: livez check failed Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.348921 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" podUID="8b52d7ad-b700-4bdb-87bb-94a66d8aaac2" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.352991 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" event={"ID":"f3d196f2-462b-4413-8cc8-c7c7a1dfa866","Type":"ContainerStarted","Data":"8d923c91ddd40fc7d21c8d69df74e028e613c8e7991703617a6622e8664faf3f"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.361890 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" event={"ID":"cb4785f9-dceb-48d1-8d9a-3f7c24f08c44","Type":"ContainerStarted","Data":"0ac523ba0e28183f971abf37cb1cd7b2f3ef87da9a97f76bdc3a3fe7a9d89fa8"} Feb 17 
17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.377540 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:53 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:53 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:53 crc kubenswrapper[4762]: healthz check failed Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.377601 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.393509 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfnk7" podStartSLOduration=120.393491502 podStartE2EDuration="2m0.393491502s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:53.362547106 +0000 UTC m=+145.007465126" watchObservedRunningTime="2026-02-17 17:49:53.393491502 +0000 UTC m=+145.038409512" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.402513 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.404010 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:53.903985116 +0000 UTC m=+145.548903136 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.408673 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" event={"ID":"fdc035aa-511f-402d-b235-b5fe70abcfd2","Type":"ContainerStarted","Data":"c1b2f66e8e3b8e3eb38d892f01019a80c62890b409276cdbd35f0aa8d4ce2679"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.433186 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" event={"ID":"3bcd89b8-e038-4635-b0e3-f4b45607811b","Type":"ContainerStarted","Data":"ead55976b52727f2007705a9db7c0e9499e2c33d599595f0973db4ba8f319e24"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.433234 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" event={"ID":"3bcd89b8-e038-4635-b0e3-f4b45607811b","Type":"ContainerStarted","Data":"8259348d29a7ae0e8921bc8a713e9138157275de9858198e654119e1170f95dd"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.436816 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" 
event={"ID":"084cdb6a-4e10-40fd-b651-d628bc556172","Type":"ContainerStarted","Data":"74c2689b65313d411e8f20586f087b5fb39ba95b56204bdb12ac97a6af4c95d5"} Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.437378 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-5r5v9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.437426 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5r5v9" podUID="60395c5c-944a-4aa8-a01d-c8619c2295ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.439361 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gk69" podStartSLOduration=121.439338613 podStartE2EDuration="2m1.439338613s" podCreationTimestamp="2026-02-17 17:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:53.43521373 +0000 UTC m=+145.080131740" watchObservedRunningTime="2026-02-17 17:49:53.439338613 +0000 UTC m=+145.084256623" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.439545 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8fkq" podStartSLOduration=120.439540129 podStartE2EDuration="2m0.439540129s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:53.395467631 +0000 UTC m=+145.040385641" 
watchObservedRunningTime="2026-02-17 17:49:53.439540129 +0000 UTC m=+145.084458139" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.498745 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.515412 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.518205 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.018191802 +0000 UTC m=+145.663109812 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.538026 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7lmj7" podStartSLOduration=120.538010254 podStartE2EDuration="2m0.538010254s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:53.463820335 +0000 UTC m=+145.108738345" watchObservedRunningTime="2026-02-17 17:49:53.538010254 +0000 UTC m=+145.182928264" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.617469 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.617614 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.117585495 +0000 UTC m=+145.762503505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.618448 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.11843499 +0000 UTC m=+145.763352990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.618758 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.720090 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.720493 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.220474062 +0000 UTC m=+145.865392072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.821382 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.821856 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.321841734 +0000 UTC m=+145.966759744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:53 crc kubenswrapper[4762]: I0217 17:49:53.923931 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:53 crc kubenswrapper[4762]: E0217 17:49:53.924349 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.424329919 +0000 UTC m=+146.069247919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.025751 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.026248 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.526229497 +0000 UTC m=+146.171147567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.126901 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.127101 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.627064453 +0000 UTC m=+146.271982463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.127189 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.127496 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.627484856 +0000 UTC m=+146.272402926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.218420 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.218799 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.220142 4762 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-gfq6k container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.13:8443/livez\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.220204 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" podUID="a2ab5d13-17f8-401b-8b7c-cb95a5e3b498" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.13:8443/livez\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.228754 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc 
kubenswrapper[4762]: E0217 17:49:54.228927 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.728902149 +0000 UTC m=+146.373820159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.229059 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.229334 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.729322452 +0000 UTC m=+146.374240462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.330291 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.330394 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.830371824 +0000 UTC m=+146.475289834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.330550 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.330923 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.83091299 +0000 UTC m=+146.475831000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.374522 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:54 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:54 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:54 crc kubenswrapper[4762]: healthz check failed Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.374602 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.431268 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.431418 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:49:54.931396336 +0000 UTC m=+146.576314346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.431526 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.431863 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:54.931854559 +0000 UTC m=+146.576772569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.444773 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" event={"ID":"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8","Type":"ContainerStarted","Data":"506dc31db5da58f4d47b187c5bfe45569b302ee3655f1250fd9bc69020535b21"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.444822 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" event={"ID":"4d10a1bb-fd22-4e00-9ee5-465663cfa3c8","Type":"ContainerStarted","Data":"a1385dc7311255f38fc4f852be84259b6d59b53a9319f22c86a2a9a71eb02b48"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.444901 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.446477 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" event={"ID":"770c3e14-c910-4422-82c5-d6671f4a91ea","Type":"ContainerStarted","Data":"9ac9b1bcd14ea2ebf9a2fc77d4a2d9a14cb4d4b97bc4ea9e63cbec34a68298bb"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.446526 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" 
event={"ID":"770c3e14-c910-4422-82c5-d6671f4a91ea","Type":"ContainerStarted","Data":"2fc5898b36f93540cb604fddc5e56547d5515f881de822cf47f43bdc3041d24a"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.447902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" event={"ID":"bd75cbc2-2e9e-4522-abff-eca6f0f29678","Type":"ContainerStarted","Data":"23278c883ada787fc685ac1e56cb21f492e711e350ba0090b0d8d8dc11e9d26a"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.462628 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" event={"ID":"a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb","Type":"ContainerStarted","Data":"5e18181aae656fb40f432fdce0d1e4ccf6e7956397688bddaf712cca18f7af89"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.462971 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.464725 4762 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nbm9w container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.464778 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" podUID="a30ad21d-8ada-4dbb-b5b0-bc7d0ad38feb" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.472798 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" event={"ID":"c10f8307-650e-49cb-a376-3781d37517b1","Type":"ContainerStarted","Data":"34e1b8526639465f41e55cc94830e4b90370e336bbf1ada437a8f6e63a23b1e2"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.472858 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" event={"ID":"c10f8307-650e-49cb-a376-3781d37517b1","Type":"ContainerStarted","Data":"4bfe708bf572a5b6540bbd095f02f3c047c6e75727e000bb98ced41bb03f54c6"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.479205 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" event={"ID":"ed085297-7845-4e38-bd40-80bcf2e1ca15","Type":"ContainerStarted","Data":"e504d31d6aff31a1462ee222c7398cd23cc0dc6821b6ef4c737de1bcc3b3aa4b"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.480327 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.481859 4762 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d6szx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.481897 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" podUID="ed085297-7845-4e38-bd40-80bcf2e1ca15" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.483530 4762 csr.go:261] certificate signing request csr-nrw54 is approved, waiting 
to be issued Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.486255 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" event={"ID":"084cdb6a-4e10-40fd-b651-d628bc556172","Type":"ContainerStarted","Data":"0784fa5f0eee9911fbd1ab2c4665ce7ee9ca8c07ad3ef015038d7d824e833c2a"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.486307 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" event={"ID":"084cdb6a-4e10-40fd-b651-d628bc556172","Type":"ContainerStarted","Data":"36543c6156567cbaef5ea0dffb64ae20f2f3d2b3813f2453c15e618a073b8673"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.494896 4762 csr.go:257] certificate signing request csr-nrw54 is issued Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.499288 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" event={"ID":"266896ca-532c-45be-b263-727feed4415f","Type":"ContainerStarted","Data":"80edb6b402e3997dea8659061b8af7a4fcde9bd53c667385e0f46dd323a40e0d"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.505233 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" podStartSLOduration=121.505218554 podStartE2EDuration="2m1.505218554s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.497255045 +0000 UTC m=+146.142173055" watchObservedRunningTime="2026-02-17 17:49:54.505218554 +0000 UTC m=+146.150136564" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.514440 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" 
event={"ID":"3b6337db-6800-4222-97ac-c9df1a8aeaec","Type":"ContainerStarted","Data":"1d4ee1c0ed3f61ceb1d4ad9daffa160945080a8fcab158fc6b7a67112050f23b"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.525318 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" event={"ID":"fdc035aa-511f-402d-b235-b5fe70abcfd2","Type":"ContainerStarted","Data":"f47fbc93f2a7bfd5a317ee88adf785eb6c4f7d47bac89d1007bcac6d42056b25"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.526335 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.527829 4762 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q2ktl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.527884 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" podUID="fdc035aa-511f-402d-b235-b5fe70abcfd2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.533014 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.534420 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.034401467 +0000 UTC m=+146.679319477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.539644 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" podStartSLOduration=121.539611562 podStartE2EDuration="2m1.539611562s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.538845259 +0000 UTC m=+146.183763269" watchObservedRunningTime="2026-02-17 17:49:54.539611562 +0000 UTC m=+146.184529572" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.549180 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" event={"ID":"49bda643-ddd8-4dd8-854a-7ec3d0f960ea","Type":"ContainerStarted","Data":"06bfab9540eaaf6c9569b14a89dcfd4186d858ec3c1860fdf5bb7aa8c384cb9c"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.553746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" 
event={"ID":"5e14e621-40b7-4585-b793-dfd0337aec04","Type":"ContainerStarted","Data":"3b4016ed8d14e5d2a0e411f79d8a66bdba6999e18f8c3001670356cb5d2193ee"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.555193 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8cnsj" event={"ID":"3fbf6589-961a-45b8-8b4f-0210b879497c","Type":"ContainerStarted","Data":"5153968c676a1b019d6f0138dd329d7a1889e7a85e1ab26438afb59702442da7"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.559456 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" event={"ID":"ddab6d46-4abb-415c-a416-e8131610b68d","Type":"ContainerStarted","Data":"05bd6bd3f2966ef45357bcba196c47e20d0bc4404b5208b6a08ee9f0fac20509"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.567735 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fwzcf" event={"ID":"7ef7b70e-3331-4d26-b1ea-c18699b6688a","Type":"ContainerStarted","Data":"40af04de4008f2f5b8df902170d3e04e22391662d3a5a655664c8e1735bbea2e"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.567771 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fwzcf" event={"ID":"7ef7b70e-3331-4d26-b1ea-c18699b6688a","Type":"ContainerStarted","Data":"6981147d942dcd9d50050956f379daf646caa39824cfabdd732bd5e598d2f093"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.568206 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fwzcf" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.580126 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" event={"ID":"2d3444be-9dcc-4072-9735-120bfeaa36aa","Type":"ContainerStarted","Data":"1a0143c62ed1a62e452f6b8b766a823202c59d3d24486e467723cd1a6b4adaa2"} Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 
17:49:54.580164 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.582359 4762 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2gktn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.582407 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.601809 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" podStartSLOduration=121.601787712 podStartE2EDuration="2m1.601787712s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.572373952 +0000 UTC m=+146.217291962" watchObservedRunningTime="2026-02-17 17:49:54.601787712 +0000 UTC m=+146.246705722" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.603843 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6x8mb" podStartSLOduration=121.603834403 podStartE2EDuration="2m1.603834403s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.60038647 +0000 UTC 
m=+146.245304480" watchObservedRunningTime="2026-02-17 17:49:54.603834403 +0000 UTC m=+146.248752413" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.636697 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.637960 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.137948434 +0000 UTC m=+146.782866444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.655302 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g7zcx" podStartSLOduration=121.655285262 podStartE2EDuration="2m1.655285262s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.653043325 +0000 UTC m=+146.297961425" watchObservedRunningTime="2026-02-17 17:49:54.655285262 +0000 UTC m=+146.300203272" Feb 17 17:49:54 crc 
kubenswrapper[4762]: I0217 17:49:54.670172 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8p4kj" podStartSLOduration=121.670153447 podStartE2EDuration="2m1.670153447s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.668103876 +0000 UTC m=+146.313021886" watchObservedRunningTime="2026-02-17 17:49:54.670153447 +0000 UTC m=+146.315071457" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.694794 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2zhrk" podStartSLOduration=121.694780803 podStartE2EDuration="2m1.694780803s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.692382382 +0000 UTC m=+146.337300392" watchObservedRunningTime="2026-02-17 17:49:54.694780803 +0000 UTC m=+146.339698813" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.741207 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.742811 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.24279429 +0000 UTC m=+146.887712300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.755291 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8cnsj" podStartSLOduration=7.7552641829999995 podStartE2EDuration="7.755264183s" podCreationTimestamp="2026-02-17 17:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.755196951 +0000 UTC m=+146.400114961" watchObservedRunningTime="2026-02-17 17:49:54.755264183 +0000 UTC m=+146.400182193" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.755448 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4ttdt" podStartSLOduration=121.755442998 podStartE2EDuration="2m1.755442998s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.721596006 +0000 UTC m=+146.366514016" watchObservedRunningTime="2026-02-17 17:49:54.755442998 +0000 UTC m=+146.400361008" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.781886 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2z554" podStartSLOduration=121.781863588 podStartE2EDuration="2m1.781863588s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.780549859 +0000 UTC m=+146.425467879" watchObservedRunningTime="2026-02-17 17:49:54.781863588 +0000 UTC m=+146.426781598" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.841306 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" podStartSLOduration=121.841291066 podStartE2EDuration="2m1.841291066s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.80800546 +0000 UTC m=+146.452923470" watchObservedRunningTime="2026-02-17 17:49:54.841291066 +0000 UTC m=+146.486209076" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.841764 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" podStartSLOduration=121.84176034 podStartE2EDuration="2m1.84176034s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.839483692 +0000 UTC m=+146.484401722" watchObservedRunningTime="2026-02-17 17:49:54.84176034 +0000 UTC m=+146.486678350" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.843309 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.843754 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.343740559 +0000 UTC m=+146.988658569 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.863743 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l9p7g" podStartSLOduration=121.863724437 podStartE2EDuration="2m1.863724437s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.862902352 +0000 UTC m=+146.507820362" watchObservedRunningTime="2026-02-17 17:49:54.863724437 +0000 UTC m=+146.508642447" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.891684 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fwzcf" podStartSLOduration=8.891667462000001 podStartE2EDuration="8.891667462s" podCreationTimestamp="2026-02-17 17:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.891323932 +0000 UTC m=+146.536241942" watchObservedRunningTime="2026-02-17 17:49:54.891667462 +0000 UTC m=+146.536585472" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.916807 4762 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" podStartSLOduration=121.916766863 podStartE2EDuration="2m1.916766863s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.914725462 +0000 UTC m=+146.559643472" watchObservedRunningTime="2026-02-17 17:49:54.916766863 +0000 UTC m=+146.561684873" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.924486 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.925309 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.932057 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.932297 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.941489 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.947606 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:54 crc kubenswrapper[4762]: E0217 17:49:54.948084 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.448067849 +0000 UTC m=+147.092985859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:54 crc kubenswrapper[4762]: I0217 17:49:54.949301 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tj4jr" podStartSLOduration=121.949290876 podStartE2EDuration="2m1.949290876s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:54.940774931 +0000 UTC m=+146.585692941" watchObservedRunningTime="2026-02-17 17:49:54.949290876 +0000 UTC m=+146.594208886" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.049001 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.049052 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.049084 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.049398 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.54937971 +0000 UTC m=+147.194297780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.149992 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.150107 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.650089542 +0000 UTC m=+147.295007552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.150435 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.150469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.150500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc 
kubenswrapper[4762]: I0217 17:49:55.150571 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.150940 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.650918987 +0000 UTC m=+147.295837067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.175499 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.245184 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.252001 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.252283 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.752267507 +0000 UTC m=+147.397185517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.353765 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.354177 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.854166185 +0000 UTC m=+147.499084195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.377310 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:55 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:55 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:55 crc kubenswrapper[4762]: healthz check failed Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.377384 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.455173 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.455988 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:55.95596848 +0000 UTC m=+147.600886490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.498764 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 17:44:54 +0000 UTC, rotation deadline is 2026-12-07 02:55:04.377825873 +0000 UTC Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.498815 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7017h5m8.879014092s for next certificate rotation Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.557460 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.557861 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.057849417 +0000 UTC m=+147.702767417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.611384 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" event={"ID":"0d171e82-72d4-4c27-ae71-83e36994e5d8","Type":"ContainerStarted","Data":"befe62110024d346700a81fac436f99ceabdd227ace35916e6f530ba4e1a002e"} Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.617173 4762 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2gktn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.617221 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.627284 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2ktl" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.634814 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.659019 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.659467 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.159447746 +0000 UTC m=+147.804365756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.742386 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nbm9w" Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.761019 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.765831 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.265818277 +0000 UTC m=+147.910736287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.871294 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.871710 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.371688334 +0000 UTC m=+148.016606344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:55 crc kubenswrapper[4762]: I0217 17:49:55.972493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:55 crc kubenswrapper[4762]: E0217 17:49:55.972850 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.472835559 +0000 UTC m=+148.117753559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.073506 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.073684 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.573654665 +0000 UTC m=+148.218572685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.073855 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.074190 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.57417735 +0000 UTC m=+148.219095360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.175085 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.175297 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.675269054 +0000 UTC m=+148.320187064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.175781 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.176200 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.676182421 +0000 UTC m=+148.321100491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.277300 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.277566 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.777533753 +0000 UTC m=+148.422451763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.277659 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.278135 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.7781242 +0000 UTC m=+148.423042280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.376220 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:56 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:56 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:56 crc kubenswrapper[4762]: healthz check failed Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.376292 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.378800 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.378949 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:49:56.878928155 +0000 UTC m=+148.523846165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.379128 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.379441 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.879433021 +0000 UTC m=+148.524351031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.479998 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.480162 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.980139243 +0000 UTC m=+148.625057263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.480227 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.480557 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:56.980545275 +0000 UTC m=+148.625463295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.518108 4762 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.581635 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.581797 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.081766772 +0000 UTC m=+148.726684782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.581856 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.582238 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.082227116 +0000 UTC m=+148.727145126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.614060 4762 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d6szx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.614122 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" podUID="ed085297-7845-4e38-bd40-80bcf2e1ca15" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.617294 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa0fe35d-31da-415c-9a29-1ce3bc06cc58","Type":"ContainerStarted","Data":"2baab68257a12b86cabd0aa74da790b05e00c3df7ff699a27e32765bfa9bd709"} Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.617375 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa0fe35d-31da-415c-9a29-1ce3bc06cc58","Type":"ContainerStarted","Data":"8be9ed6b0a515a1c52e156a473b2c7a21a26eca6840a9726b9f8768ba2e83aec"} Feb 17 17:49:56 crc 
kubenswrapper[4762]: I0217 17:49:56.619083 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" event={"ID":"0d171e82-72d4-4c27-ae71-83e36994e5d8","Type":"ContainerStarted","Data":"7956bae1810da75820efbda12f7a81207eb1d613dab36408c85253f1c79c0fbf"} Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.632767 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.632746077 podStartE2EDuration="2.632746077s" podCreationTimestamp="2026-02-17 17:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:56.629997165 +0000 UTC m=+148.274915175" watchObservedRunningTime="2026-02-17 17:49:56.632746077 +0000 UTC m=+148.277664087" Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.682726 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.682849 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.182827625 +0000 UTC m=+148.827745645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.682991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.683277 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.183268938 +0000 UTC m=+148.828186948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.784263 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.784446 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.284419014 +0000 UTC m=+148.929337024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.784667 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.786289 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.286280149 +0000 UTC m=+148.931198159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.886717 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.886850 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.386824687 +0000 UTC m=+149.031742697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.886932 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:56 crc kubenswrapper[4762]: E0217 17:49:56.887281 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.38727359 +0000 UTC m=+149.032191600 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:56 crc kubenswrapper[4762]: I0217 17:49:56.968227 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d6szx" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.988180 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:57 crc kubenswrapper[4762]: E0217 17:49:56.988304 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.488281751 +0000 UTC m=+149.133199771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.988379 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.988428 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.988473 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.988572 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.988643 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:57 crc kubenswrapper[4762]: E0217 17:49:56.989042 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.489025654 +0000 UTC m=+149.133943664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.993731 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.995391 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.995520 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:56.996158 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.054383 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.065267 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.074000 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jnlvk"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.074261 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.075154 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.077518 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.093020 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:57 crc kubenswrapper[4762]: E0217 17:49:57.093140 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.593122767 +0000 UTC m=+149.238040777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.093565 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:57 crc kubenswrapper[4762]: E0217 17:49:57.094161 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.594144948 +0000 UTC m=+149.239062958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zc64c" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.132808 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jnlvk"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.201917 4762 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T17:49:56.51817902Z","Handler":null,"Name":""} Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.202910 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.203079 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58wf\" (UniqueName: \"kubernetes.io/projected/a0697342-ade9-480a-9ac9-074416d620ef-kube-api-access-k58wf\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.203184 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-utilities\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.203245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-catalog-content\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: E0217 17:49:57.203345 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:49:57.703326713 +0000 UTC m=+149.348244723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.215050 4762 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.215227 4762 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.216494 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zfghh"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.238106 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.248961 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.265811 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zfghh"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.308381 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-utilities\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.308432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.308488 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-catalog-content\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.308525 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58wf\" (UniqueName: \"kubernetes.io/projected/a0697342-ade9-480a-9ac9-074416d620ef-kube-api-access-k58wf\") pod \"community-operators-jnlvk\" (UID: 
\"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.310285 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-catalog-content\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.310336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-utilities\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.311962 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.311994 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.341172 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58wf\" (UniqueName: \"kubernetes.io/projected/a0697342-ade9-480a-9ac9-074416d620ef-kube-api-access-k58wf\") pod \"community-operators-jnlvk\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.345608 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zc64c\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.376817 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:57 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:57 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:57 crc kubenswrapper[4762]: healthz check failed Feb 17 
17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.376875 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.409757 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.410342 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-utilities\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.410378 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-catalog-content\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.410405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvbdb\" (UniqueName: \"kubernetes.io/projected/ae06034f-323c-4a19-95bb-ba8c21fda464-kube-api-access-rvbdb\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 
17:49:57.415790 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jb99z"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.416659 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.425107 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.427916 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jb99z"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.437692 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.496078 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.511610 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-catalog-content\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.511690 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-utilities\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.511724 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-utilities\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.511747 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-catalog-content\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.511773 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbdb\" (UniqueName: \"kubernetes.io/projected/ae06034f-323c-4a19-95bb-ba8c21fda464-kube-api-access-rvbdb\") pod \"certified-operators-zfghh\" 
(UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.511796 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszf7\" (UniqueName: \"kubernetes.io/projected/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-kube-api-access-mszf7\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.512585 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-utilities\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.512861 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-catalog-content\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.534993 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvbdb\" (UniqueName: \"kubernetes.io/projected/ae06034f-323c-4a19-95bb-ba8c21fda464-kube-api-access-rvbdb\") pod \"certified-operators-zfghh\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: W0217 17:49:57.570588 4762 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-23070e361946735bd1a5c9c75671bfa2b0ff8463b49854c21583d0a7a39eb7c9 WatchSource:0}: Error finding container 23070e361946735bd1a5c9c75671bfa2b0ff8463b49854c21583d0a7a39eb7c9: Status 404 returned error can't find the container with id 23070e361946735bd1a5c9c75671bfa2b0ff8463b49854c21583d0a7a39eb7c9 Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.575331 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.612843 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-catalog-content\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.612903 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-utilities\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.612927 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszf7\" (UniqueName: \"kubernetes.io/projected/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-kube-api-access-mszf7\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.613573 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-catalog-content\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.613707 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-utilities\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.619524 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9s9f7"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.621168 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.640462 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszf7\" (UniqueName: \"kubernetes.io/projected/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-kube-api-access-mszf7\") pod \"community-operators-jb99z\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.651732 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9s9f7"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.678048 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" event={"ID":"0d171e82-72d4-4c27-ae71-83e36994e5d8","Type":"ContainerStarted","Data":"64b43ed593947c098988dced38cd7f5f1d613e6bbc65d80b0ff7e11cba532089"} Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.678095 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" event={"ID":"0d171e82-72d4-4c27-ae71-83e36994e5d8","Type":"ContainerStarted","Data":"2a10bb3685067a86b951fd10cc828173efca1e035055d4787422a39120ae34f7"} Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.679897 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"23070e361946735bd1a5c9c75671bfa2b0ff8463b49854c21583d0a7a39eb7c9"} Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.682077 4762 generic.go:334] "Generic (PLEG): container finished" podID="fa0fe35d-31da-415c-9a29-1ce3bc06cc58" containerID="2baab68257a12b86cabd0aa74da790b05e00c3df7ff699a27e32765bfa9bd709" exitCode=0 Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.682320 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa0fe35d-31da-415c-9a29-1ce3bc06cc58","Type":"ContainerDied","Data":"2baab68257a12b86cabd0aa74da790b05e00c3df7ff699a27e32765bfa9bd709"} Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.711849 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-l7mfh" podStartSLOduration=11.711825333 podStartE2EDuration="11.711825333s" podCreationTimestamp="2026-02-17 17:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:57.710325028 +0000 UTC m=+149.355243038" watchObservedRunningTime="2026-02-17 17:49:57.711825333 +0000 UTC m=+149.356743343" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.715110 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-utilities\") pod 
\"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.715178 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-catalog-content\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.715201 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7fz\" (UniqueName: \"kubernetes.io/projected/55b54250-be7b-4b98-9716-68be885af4d1-kube-api-access-fb7fz\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.723943 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jnlvk"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.739272 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:49:57 crc kubenswrapper[4762]: W0217 17:49:57.773779 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0697342_ade9_480a_9ac9_074416d620ef.slice/crio-fdfb4f69a5c7e02e4e931f5d83eebf18bda30fe70e127059e07dce3fa7ea7d20 WatchSource:0}: Error finding container fdfb4f69a5c7e02e4e931f5d83eebf18bda30fe70e127059e07dce3fa7ea7d20: Status 404 returned error can't find the container with id fdfb4f69a5c7e02e4e931f5d83eebf18bda30fe70e127059e07dce3fa7ea7d20 Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.789603 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zc64c"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.816058 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-utilities\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.816135 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-catalog-content\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.816157 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7fz\" (UniqueName: \"kubernetes.io/projected/55b54250-be7b-4b98-9716-68be885af4d1-kube-api-access-fb7fz\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 
17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.818020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-catalog-content\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.818077 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-utilities\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: W0217 17:49:57.824966 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15469884_f0fd_4460_97dd_6a428a3e7e0d.slice/crio-4d7b462e8ac8c48e6405b9d0ed1aa64dd75872c57199e2d8cba1bd5de1bbaa4f WatchSource:0}: Error finding container 4d7b462e8ac8c48e6405b9d0ed1aa64dd75872c57199e2d8cba1bd5de1bbaa4f: Status 404 returned error can't find the container with id 4d7b462e8ac8c48e6405b9d0ed1aa64dd75872c57199e2d8cba1bd5de1bbaa4f Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.843857 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7fz\" (UniqueName: \"kubernetes.io/projected/55b54250-be7b-4b98-9716-68be885af4d1-kube-api-access-fb7fz\") pod \"certified-operators-9s9f7\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.867934 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zfghh"] Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.952407 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.972851 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.978066 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nsnbr" Feb 17 17:49:57 crc kubenswrapper[4762]: I0217 17:49:57.997820 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jb99z"] Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.298718 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9s9f7"] Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.374352 4762 patch_prober.go:28] interesting pod/router-default-5444994796-pvxtx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:49:58 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 17:49:58 crc kubenswrapper[4762]: [+]process-running ok Feb 17 17:49:58 crc kubenswrapper[4762]: healthz check failed Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.374760 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pvxtx" podUID="86c83b85-567c-43f9-ac88-e332e05bea98" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.689978 4762 generic.go:334] "Generic (PLEG): container finished" podID="5e14e621-40b7-4585-b793-dfd0337aec04" containerID="3b4016ed8d14e5d2a0e411f79d8a66bdba6999e18f8c3001670356cb5d2193ee" exitCode=0 Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.690045 4762 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" event={"ID":"5e14e621-40b7-4585-b793-dfd0337aec04","Type":"ContainerDied","Data":"3b4016ed8d14e5d2a0e411f79d8a66bdba6999e18f8c3001670356cb5d2193ee"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.691916 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" event={"ID":"15469884-f0fd-4460-97dd-6a428a3e7e0d","Type":"ContainerStarted","Data":"ad519fee8a6ce38d7504d01ac7cccd506064da17ceebec5d294ce9e0d1b98172"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.691945 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" event={"ID":"15469884-f0fd-4460-97dd-6a428a3e7e0d","Type":"ContainerStarted","Data":"4d7b462e8ac8c48e6405b9d0ed1aa64dd75872c57199e2d8cba1bd5de1bbaa4f"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.692017 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.693425 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cbdd7feb57cc5a4fcd5b751e55a9c2e33fffc4512b1e194c07891cc85cfde96b"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.693472 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cd29d035136356dd46b08b2ffdc380e7676ad29e5822d4c2bf47176d91a56d6c"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.693652 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:49:58 crc kubenswrapper[4762]: 
I0217 17:49:58.695769 4762 generic.go:334] "Generic (PLEG): container finished" podID="55b54250-be7b-4b98-9716-68be885af4d1" containerID="98a3b98b91d9fb457bd604a173f6f9b366876637c0f8ec3f1e092a2f558e9b59" exitCode=0 Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.695823 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9s9f7" event={"ID":"55b54250-be7b-4b98-9716-68be885af4d1","Type":"ContainerDied","Data":"98a3b98b91d9fb457bd604a173f6f9b366876637c0f8ec3f1e092a2f558e9b59"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.695877 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9s9f7" event={"ID":"55b54250-be7b-4b98-9716-68be885af4d1","Type":"ContainerStarted","Data":"e862dcf35f4fd7ee664d2c9311473b014deca2ddebbbe28f5309ac8b4217fda5"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.697340 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"857d5b910b8476774f67be1e3df38221b96467ce23da950e1dd699426eb559e7"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.697404 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.699212 4762 generic.go:334] "Generic (PLEG): container finished" podID="a0697342-ade9-480a-9ac9-074416d620ef" containerID="bf9e30eac18fa99baf62c0ac945a097e4770c92ce3f7a92a21342329fbb44a8d" exitCode=0 Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.699275 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnlvk" event={"ID":"a0697342-ade9-480a-9ac9-074416d620ef","Type":"ContainerDied","Data":"bf9e30eac18fa99baf62c0ac945a097e4770c92ce3f7a92a21342329fbb44a8d"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 
17:49:58.699293 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnlvk" event={"ID":"a0697342-ade9-480a-9ac9-074416d620ef","Type":"ContainerStarted","Data":"fdfb4f69a5c7e02e4e931f5d83eebf18bda30fe70e127059e07dce3fa7ea7d20"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.701016 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerID="bd04ac27ea97d97b3ea9a7964120f18f23be95322afc27b8e14d0d1c3f73977e" exitCode=0 Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.701078 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zfghh" event={"ID":"ae06034f-323c-4a19-95bb-ba8c21fda464","Type":"ContainerDied","Data":"bd04ac27ea97d97b3ea9a7964120f18f23be95322afc27b8e14d0d1c3f73977e"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.701100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zfghh" event={"ID":"ae06034f-323c-4a19-95bb-ba8c21fda464","Type":"ContainerStarted","Data":"7addc00f060e7698fc1eee97822d05d624f0f3708d34283f09d46c4af3fb062f"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.705304 4762 generic.go:334] "Generic (PLEG): container finished" podID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerID="fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4" exitCode=0 Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.705408 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb99z" event={"ID":"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484","Type":"ContainerDied","Data":"fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.705438 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb99z" 
event={"ID":"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484","Type":"ContainerStarted","Data":"23092a08fa6c1bfe6d3beea536190accd6c380f031affaa0433092a58e7940d4"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.710915 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0a480a8001442056fb716cbd4633caad37c0bdbf5b9fb399e2e5f1ea1e50c2fb"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.710974 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"70fc5b39a92e3d1feec3b2866a327f49669c9502e8dc2a5413153afa208810b1"} Feb 17 17:49:58 crc kubenswrapper[4762]: I0217 17:49:58.899346 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" podStartSLOduration=125.8993173 podStartE2EDuration="2m5.8993173s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:58.890836846 +0000 UTC m=+150.535754866" watchObservedRunningTime="2026-02-17 17:49:58.8993173 +0000 UTC m=+150.544235310" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.017809 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fmbwb"] Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.018998 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.027875 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.044994 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.046756 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.049748 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmbwb"] Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.151733 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kube-api-access\") pod \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.151773 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kubelet-dir\") pod \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\" (UID: \"fa0fe35d-31da-415c-9a29-1ce3bc06cc58\") " Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.151943 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa0fe35d-31da-415c-9a29-1ce3bc06cc58" (UID: "fa0fe35d-31da-415c-9a29-1ce3bc06cc58"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.152180 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-utilities\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.152315 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ktv\" (UniqueName: \"kubernetes.io/projected/1ff10a6d-758d-44f1-bc36-f2843c20401c-kube-api-access-28ktv\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.152382 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-catalog-content\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.152530 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.160070 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa0fe35d-31da-415c-9a29-1ce3bc06cc58" (UID: "fa0fe35d-31da-415c-9a29-1ce3bc06cc58"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.196401 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.196460 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.198841 4762 patch_prober.go:28] interesting pod/console-f9d7485db-zfmsb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.198912 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zfmsb" podUID="c7f82eed-54cf-4b40-b996-e23d502a4f9e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.226962 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.233165 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gfq6k" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.254275 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ktv\" (UniqueName: \"kubernetes.io/projected/1ff10a6d-758d-44f1-bc36-f2843c20401c-kube-api-access-28ktv\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.254375 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-catalog-content\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.254485 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-utilities\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.254563 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa0fe35d-31da-415c-9a29-1ce3bc06cc58-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.255079 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-utilities\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.255156 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-catalog-content\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.271457 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ktv\" (UniqueName: 
\"kubernetes.io/projected/1ff10a6d-758d-44f1-bc36-f2843c20401c-kube-api-access-28ktv\") pod \"redhat-marketplace-fmbwb\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.287075 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-5r5v9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.287132 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5r5v9" podUID="60395c5c-944a-4aa8-a01d-c8619c2295ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.287180 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-5r5v9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.287222 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5r5v9" podUID="60395c5c-944a-4aa8-a01d-c8619c2295ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.354482 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.370833 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.389257 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.417542 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4hgg"] Feb 17 17:49:59 crc kubenswrapper[4762]: E0217 17:49:59.418101 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0fe35d-31da-415c-9a29-1ce3bc06cc58" containerName="pruner" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.418123 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0fe35d-31da-415c-9a29-1ce3bc06cc58" containerName="pruner" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.418237 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0fe35d-31da-415c-9a29-1ce3bc06cc58" containerName="pruner" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.419028 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.433814 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4hgg"] Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.566848 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-catalog-content\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.567252 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgw24\" (UniqueName: \"kubernetes.io/projected/49fb31e9-5d77-487b-bcdb-647dafb291fb-kube-api-access-rgw24\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.567286 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-utilities\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.668010 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-catalog-content\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.668047 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgw24\" (UniqueName: \"kubernetes.io/projected/49fb31e9-5d77-487b-bcdb-647dafb291fb-kube-api-access-rgw24\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.668067 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-utilities\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.668558 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-utilities\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.668815 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-catalog-content\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.686918 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmbwb"] Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.714322 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgw24\" (UniqueName: \"kubernetes.io/projected/49fb31e9-5d77-487b-bcdb-647dafb291fb-kube-api-access-rgw24\") pod \"redhat-marketplace-m4hgg\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " 
pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.749505 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.770384 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmbwb" event={"ID":"1ff10a6d-758d-44f1-bc36-f2843c20401c","Type":"ContainerStarted","Data":"4e1d3ac28bf587f09158284f2be46f6f28801d663a60a66cd7eede0f363a7279"} Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.774250 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.782696 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa0fe35d-31da-415c-9a29-1ce3bc06cc58","Type":"ContainerDied","Data":"8be9ed6b0a515a1c52e156a473b2c7a21a26eca6840a9726b9f8768ba2e83aec"} Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.782763 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8be9ed6b0a515a1c52e156a473b2c7a21a26eca6840a9726b9f8768ba2e83aec" Feb 17 17:49:59 crc kubenswrapper[4762]: I0217 17:49:59.798557 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pvxtx" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.130793 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4hgg"] Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.146054 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.216554 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jq9qr"] Feb 17 17:50:00 crc kubenswrapper[4762]: E0217 17:50:00.216821 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e14e621-40b7-4585-b793-dfd0337aec04" containerName="collect-profiles" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.216835 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e14e621-40b7-4585-b793-dfd0337aec04" containerName="collect-profiles" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.216955 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e14e621-40b7-4585-b793-dfd0337aec04" containerName="collect-profiles" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.222764 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.225563 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.244611 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq9qr"] Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.287308 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wkkv\" (UniqueName: \"kubernetes.io/projected/5e14e621-40b7-4585-b793-dfd0337aec04-kube-api-access-4wkkv\") pod \"5e14e621-40b7-4585-b793-dfd0337aec04\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.287414 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume\") pod \"5e14e621-40b7-4585-b793-dfd0337aec04\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.287555 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e14e621-40b7-4585-b793-dfd0337aec04-secret-volume\") pod \"5e14e621-40b7-4585-b793-dfd0337aec04\" (UID: \"5e14e621-40b7-4585-b793-dfd0337aec04\") " Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.293900 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e14e621-40b7-4585-b793-dfd0337aec04" (UID: "5e14e621-40b7-4585-b793-dfd0337aec04"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.322114 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e14e621-40b7-4585-b793-dfd0337aec04-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e14e621-40b7-4585-b793-dfd0337aec04" (UID: "5e14e621-40b7-4585-b793-dfd0337aec04"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.322815 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e14e621-40b7-4585-b793-dfd0337aec04-kube-api-access-4wkkv" (OuterVolumeSpecName: "kube-api-access-4wkkv") pod "5e14e621-40b7-4585-b793-dfd0337aec04" (UID: "5e14e621-40b7-4585-b793-dfd0337aec04"). InnerVolumeSpecName "kube-api-access-4wkkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.396351 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-utilities\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.396448 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829rm\" (UniqueName: \"kubernetes.io/projected/625b741e-9e06-4f4d-a143-8a576c59eb70-kube-api-access-829rm\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.396487 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-catalog-content\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.396549 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e14e621-40b7-4585-b793-dfd0337aec04-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.396568 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wkkv\" (UniqueName: \"kubernetes.io/projected/5e14e621-40b7-4585-b793-dfd0337aec04-kube-api-access-4wkkv\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.396580 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5e14e621-40b7-4585-b793-dfd0337aec04-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.404266 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.502393 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-utilities\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.502541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-829rm\" (UniqueName: \"kubernetes.io/projected/625b741e-9e06-4f4d-a143-8a576c59eb70-kube-api-access-829rm\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.502644 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-catalog-content\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.503365 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-catalog-content\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.504178 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-utilities\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.545028 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829rm\" (UniqueName: \"kubernetes.io/projected/625b741e-9e06-4f4d-a143-8a576c59eb70-kube-api-access-829rm\") pod \"redhat-operators-jq9qr\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.572973 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.643041 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6m7l2"] Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.653768 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.672920 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6m7l2"] Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.811258 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-catalog-content\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.811325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xhv9\" (UniqueName: \"kubernetes.io/projected/9c4c88c9-be13-4f25-8975-d09ad5affc6f-kube-api-access-4xhv9\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.811391 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-utilities\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.822903 4762 generic.go:334] "Generic (PLEG): container finished" podID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerID="15abfcc28c6d31a0e6346b99a6b6d077d8b471b2695f169e9723f2603416a423" exitCode=0 Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.822979 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4hgg" 
event={"ID":"49fb31e9-5d77-487b-bcdb-647dafb291fb","Type":"ContainerDied","Data":"15abfcc28c6d31a0e6346b99a6b6d077d8b471b2695f169e9723f2603416a423"} Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.823023 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4hgg" event={"ID":"49fb31e9-5d77-487b-bcdb-647dafb291fb","Type":"ContainerStarted","Data":"6144c1b3e1531171c9912a25e1a4c26fc89b4419132062cffd0ddbcb80ea1a03"} Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.838267 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.838972 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-kdv7g" event={"ID":"5e14e621-40b7-4585-b793-dfd0337aec04","Type":"ContainerDied","Data":"4bacaf123313e0901db1034a2426dfb71fb078c50a6ef2e66ac69d9e466fb118"} Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.839014 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bacaf123313e0901db1034a2426dfb71fb078c50a6ef2e66ac69d9e466fb118" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.842338 4762 generic.go:334] "Generic (PLEG): container finished" podID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerID="ff01d2d6dad4ff3c81083ed5a1e8fa149fd989ba191b5471b226a63ee4e54db1" exitCode=0 Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.843602 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmbwb" event={"ID":"1ff10a6d-758d-44f1-bc36-f2843c20401c","Type":"ContainerDied","Data":"ff01d2d6dad4ff3c81083ed5a1e8fa149fd989ba191b5471b226a63ee4e54db1"} Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.912553 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-catalog-content\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.912614 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xhv9\" (UniqueName: \"kubernetes.io/projected/9c4c88c9-be13-4f25-8975-d09ad5affc6f-kube-api-access-4xhv9\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.912707 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-utilities\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.913734 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-catalog-content\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.914815 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-utilities\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:00 crc kubenswrapper[4762]: I0217 17:50:00.953868 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xhv9\" (UniqueName: 
\"kubernetes.io/projected/9c4c88c9-be13-4f25-8975-d09ad5affc6f-kube-api-access-4xhv9\") pod \"redhat-operators-6m7l2\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.064006 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.162190 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq9qr"] Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.471366 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6m7l2"] Feb 17 17:50:01 crc kubenswrapper[4762]: W0217 17:50:01.558012 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4c88c9_be13_4f25_8975_d09ad5affc6f.slice/crio-a2187bdc62dac94dc5b7fc795e31ccc5af2a4fde564d9e2a4eac97ff96cbf547 WatchSource:0}: Error finding container a2187bdc62dac94dc5b7fc795e31ccc5af2a4fde564d9e2a4eac97ff96cbf547: Status 404 returned error can't find the container with id a2187bdc62dac94dc5b7fc795e31ccc5af2a4fde564d9e2a4eac97ff96cbf547 Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.851529 4762 generic.go:334] "Generic (PLEG): container finished" podID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerID="4f339e29a8012b009ecc4488734d51cbd8dd75c370e8921e0185c93679ffae6a" exitCode=0 Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.851616 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerDied","Data":"4f339e29a8012b009ecc4488734d51cbd8dd75c370e8921e0185c93679ffae6a"} Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.851673 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerStarted","Data":"a2187bdc62dac94dc5b7fc795e31ccc5af2a4fde564d9e2a4eac97ff96cbf547"} Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.857360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerDied","Data":"ec139cdcb22d72a50ae1252262a6b3428847883c4a765f1f52cfb241145e9eab"} Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.857263 4762 generic.go:334] "Generic (PLEG): container finished" podID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerID="ec139cdcb22d72a50ae1252262a6b3428847883c4a765f1f52cfb241145e9eab" exitCode=0 Feb 17 17:50:01 crc kubenswrapper[4762]: I0217 17:50:01.857801 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerStarted","Data":"efaebd4160cef745bbc3d3f9b1a7cc6c5b222fadc5cc8c3e004af40fd624c74b"} Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.660291 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.661110 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.665592 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.666152 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.667340 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.752153 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.752329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.853314 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.853397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.853554 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.886002 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:02 crc kubenswrapper[4762]: I0217 17:50:02.987468 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:03 crc kubenswrapper[4762]: I0217 17:50:03.633593 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 17:50:03 crc kubenswrapper[4762]: W0217 17:50:03.680397 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4d3f4d5f_b526_4b5d_b93a_8b66c4fa343c.slice/crio-3e5088c1c3f611de78fffdaab5aed181f16061fdcceb1e37c8c819063ac20604 WatchSource:0}: Error finding container 3e5088c1c3f611de78fffdaab5aed181f16061fdcceb1e37c8c819063ac20604: Status 404 returned error can't find the container with id 3e5088c1c3f611de78fffdaab5aed181f16061fdcceb1e37c8c819063ac20604 Feb 17 17:50:03 crc kubenswrapper[4762]: I0217 17:50:03.884445 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c","Type":"ContainerStarted","Data":"3e5088c1c3f611de78fffdaab5aed181f16061fdcceb1e37c8c819063ac20604"} Feb 17 17:50:04 crc kubenswrapper[4762]: I0217 17:50:04.558973 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:50:04 crc kubenswrapper[4762]: I0217 17:50:04.559378 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:50:04 crc kubenswrapper[4762]: I0217 17:50:04.828019 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fwzcf" 
Feb 17 17:50:04 crc kubenswrapper[4762]: I0217 17:50:04.912928 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c","Type":"ContainerStarted","Data":"59fd4b14c7c1e2cd34c5d8fcdc005cc89913a75abf7b19bebbf5a5b96d4d89a9"} Feb 17 17:50:05 crc kubenswrapper[4762]: I0217 17:50:05.938889 4762 generic.go:334] "Generic (PLEG): container finished" podID="4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c" containerID="59fd4b14c7c1e2cd34c5d8fcdc005cc89913a75abf7b19bebbf5a5b96d4d89a9" exitCode=0 Feb 17 17:50:05 crc kubenswrapper[4762]: I0217 17:50:05.938960 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c","Type":"ContainerDied","Data":"59fd4b14c7c1e2cd34c5d8fcdc005cc89913a75abf7b19bebbf5a5b96d4d89a9"} Feb 17 17:50:09 crc kubenswrapper[4762]: I0217 17:50:09.207749 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:50:09 crc kubenswrapper[4762]: I0217 17:50:09.226127 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zfmsb" Feb 17 17:50:09 crc kubenswrapper[4762]: I0217 17:50:09.307367 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5r5v9" Feb 17 17:50:15 crc kubenswrapper[4762]: I0217 17:50:15.552908 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:50:15 crc kubenswrapper[4762]: I0217 17:50:15.559581 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bb87d75-4230-44b9-8ee8-7aff6d051904-metrics-certs\") pod \"network-metrics-daemon-wdzt7\" (UID: \"6bb87d75-4230-44b9-8ee8-7aff6d051904\") " pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:50:15 crc kubenswrapper[4762]: I0217 17:50:15.682518 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdzt7" Feb 17 17:50:17 crc kubenswrapper[4762]: I0217 17:50:17.502417 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.480022 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.588824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kube-api-access\") pod \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.588892 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kubelet-dir\") pod \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\" (UID: \"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c\") " Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.589124 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c" (UID: "4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.594069 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c" (UID: "4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.690069 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:18 crc kubenswrapper[4762]: I0217 17:50:18.690103 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:19 crc kubenswrapper[4762]: I0217 17:50:19.045582 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c","Type":"ContainerDied","Data":"3e5088c1c3f611de78fffdaab5aed181f16061fdcceb1e37c8c819063ac20604"} Feb 17 17:50:19 crc kubenswrapper[4762]: I0217 17:50:19.045641 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e5088c1c3f611de78fffdaab5aed181f16061fdcceb1e37c8c819063ac20604" Feb 17 17:50:19 crc kubenswrapper[4762]: I0217 17:50:19.045735 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:50:27 crc kubenswrapper[4762]: I0217 17:50:27.079912 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:50:27 crc kubenswrapper[4762]: E0217 17:50:27.562890 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 17:50:27 crc kubenswrapper[4762]: E0217 17:50:27.563396 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fb7fz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:ni
l,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9s9f7_openshift-marketplace(55b54250-be7b-4b98-9716-68be885af4d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:50:27 crc kubenswrapper[4762]: E0217 17:50:27.564687 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9s9f7" podUID="55b54250-be7b-4b98-9716-68be885af4d1" Feb 17 17:50:30 crc kubenswrapper[4762]: I0217 17:50:30.078555 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gdlw4" Feb 17 17:50:30 crc kubenswrapper[4762]: E0217 17:50:30.429149 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9s9f7" podUID="55b54250-be7b-4b98-9716-68be885af4d1" Feb 17 17:50:31 crc kubenswrapper[4762]: E0217 17:50:31.929997 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 17:50:31 crc kubenswrapper[4762]: E0217 17:50:31.931606 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mszf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jb99z_openshift-marketplace(68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:50:31 crc kubenswrapper[4762]: E0217 17:50:31.932897 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jb99z" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" Feb 17 17:50:32 crc kubenswrapper[4762]: E0217 17:50:32.087217 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 17:50:32 crc kubenswrapper[4762]: E0217 17:50:32.087335 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k58wf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jnlvk_openshift-marketplace(a0697342-ade9-480a-9ac9-074416d620ef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:50:32 crc kubenswrapper[4762]: E0217 17:50:32.088504 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jnlvk" podUID="a0697342-ade9-480a-9ac9-074416d620ef" Feb 17 17:50:32 crc kubenswrapper[4762]: E0217 17:50:32.148075 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jnlvk" podUID="a0697342-ade9-480a-9ac9-074416d620ef" Feb 17 17:50:32 crc kubenswrapper[4762]: E0217 17:50:32.150371 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jb99z" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" Feb 17 17:50:32 crc kubenswrapper[4762]: I0217 17:50:32.361024 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wdzt7"] Feb 17 17:50:32 crc kubenswrapper[4762]: W0217 17:50:32.369906 4762 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb87d75_4230_44b9_8ee8_7aff6d051904.slice/crio-e7d3f3682f224a9d910646afd7fe6e98859459d1d6f1f0a61ef9f69167415c6a WatchSource:0}: Error finding container e7d3f3682f224a9d910646afd7fe6e98859459d1d6f1f0a61ef9f69167415c6a: Status 404 returned error can't find the container with id e7d3f3682f224a9d910646afd7fe6e98859459d1d6f1f0a61ef9f69167415c6a Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.152328 4762 generic.go:334] "Generic (PLEG): container finished" podID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerID="aa754b1de96634d012912429bb5011050977887a3b025cf46c997c6256418552" exitCode=0 Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.152387 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmbwb" event={"ID":"1ff10a6d-758d-44f1-bc36-f2843c20401c","Type":"ContainerDied","Data":"aa754b1de96634d012912429bb5011050977887a3b025cf46c997c6256418552"} Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.153483 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" event={"ID":"6bb87d75-4230-44b9-8ee8-7aff6d051904","Type":"ContainerStarted","Data":"e7d3f3682f224a9d910646afd7fe6e98859459d1d6f1f0a61ef9f69167415c6a"} Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.158149 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerID="13f373d199d9720fb011274c1a7c147fc8c17a8ac8f46db551f34a22f49cee72" exitCode=0 Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.158221 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zfghh" event={"ID":"ae06034f-323c-4a19-95bb-ba8c21fda464","Type":"ContainerDied","Data":"13f373d199d9720fb011274c1a7c147fc8c17a8ac8f46db551f34a22f49cee72"} Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.165230 4762 generic.go:334] "Generic (PLEG): container 
finished" podID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerID="93345c0d8d8a7b37c9868176c969f16bfb9692f09119dd11c8c3bcd3dcd0c1a6" exitCode=0 Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.165337 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4hgg" event={"ID":"49fb31e9-5d77-487b-bcdb-647dafb291fb","Type":"ContainerDied","Data":"93345c0d8d8a7b37c9868176c969f16bfb9692f09119dd11c8c3bcd3dcd0c1a6"} Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.168510 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerStarted","Data":"b039f6f5b600dcf452e209a62b9643b19075606739c353d1f38d5f39d4da7aa3"} Feb 17 17:50:33 crc kubenswrapper[4762]: I0217 17:50:33.178340 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerStarted","Data":"83e03edf99f0f22e0ce37174107eccc07ac400b1839aa52f369039f2afec1d7d"} Feb 17 17:50:34 crc kubenswrapper[4762]: I0217 17:50:34.187247 4762 generic.go:334] "Generic (PLEG): container finished" podID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerID="b039f6f5b600dcf452e209a62b9643b19075606739c353d1f38d5f39d4da7aa3" exitCode=0 Feb 17 17:50:34 crc kubenswrapper[4762]: I0217 17:50:34.187291 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerDied","Data":"b039f6f5b600dcf452e209a62b9643b19075606739c353d1f38d5f39d4da7aa3"} Feb 17 17:50:34 crc kubenswrapper[4762]: I0217 17:50:34.190577 4762 generic.go:334] "Generic (PLEG): container finished" podID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerID="83e03edf99f0f22e0ce37174107eccc07ac400b1839aa52f369039f2afec1d7d" exitCode=0 Feb 17 17:50:34 crc kubenswrapper[4762]: I0217 17:50:34.190602 
4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerDied","Data":"83e03edf99f0f22e0ce37174107eccc07ac400b1839aa52f369039f2afec1d7d"} Feb 17 17:50:34 crc kubenswrapper[4762]: I0217 17:50:34.558956 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:50:34 crc kubenswrapper[4762]: I0217 17:50:34.559034 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:50:35 crc kubenswrapper[4762]: I0217 17:50:35.196755 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" event={"ID":"6bb87d75-4230-44b9-8ee8-7aff6d051904","Type":"ContainerStarted","Data":"476e8ec6f040e9d6dc82f86f7d91f13427f869cfd2cf321caa9721337c332e1e"} Feb 17 17:50:37 crc kubenswrapper[4762]: I0217 17:50:37.208567 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdzt7" event={"ID":"6bb87d75-4230-44b9-8ee8-7aff6d051904","Type":"ContainerStarted","Data":"097771aaa16fbf29369c7adb81d4d7f182cd123b3bd9f36b3d2c8004cb976e56"} Feb 17 17:50:37 crc kubenswrapper[4762]: I0217 17:50:37.225009 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wdzt7" podStartSLOduration=164.224989212 podStartE2EDuration="2m44.224989212s" podCreationTimestamp="2026-02-17 17:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:50:37.221794097 +0000 UTC m=+188.866712107" watchObservedRunningTime="2026-02-17 17:50:37.224989212 +0000 UTC m=+188.869907222" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.046610 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 17:50:38 crc kubenswrapper[4762]: E0217 17:50:38.048263 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c" containerName="pruner" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.048279 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c" containerName="pruner" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.048429 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3f4d5f-b526-4b5d-b93a-8b66c4fa343c" containerName="pruner" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.051186 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.053412 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.054228 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.054352 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.234961 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec359e3-da36-4417-ad0f-be50ec2375b9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.235089 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ec359e3-da36-4417-ad0f-be50ec2375b9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.336888 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ec359e3-da36-4417-ad0f-be50ec2375b9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.336996 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5ec359e3-da36-4417-ad0f-be50ec2375b9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.337163 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ec359e3-da36-4417-ad0f-be50ec2375b9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.356392 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec359e3-da36-4417-ad0f-be50ec2375b9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.374338 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:38 crc kubenswrapper[4762]: I0217 17:50:38.746319 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 17:50:39 crc kubenswrapper[4762]: I0217 17:50:39.373587 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zfghh" event={"ID":"ae06034f-323c-4a19-95bb-ba8c21fda464","Type":"ContainerStarted","Data":"090d7a1886c00978a221f82900cdc10774825743f5c2f80e6e946d59b359601b"} Feb 17 17:50:39 crc kubenswrapper[4762]: I0217 17:50:39.376188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5ec359e3-da36-4417-ad0f-be50ec2375b9","Type":"ContainerStarted","Data":"322897493a473c88b9af6a4a12b81d8813e3b56417c98bad5bf2ee64799df301"} Feb 17 17:50:39 crc kubenswrapper[4762]: I0217 17:50:39.376221 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5ec359e3-da36-4417-ad0f-be50ec2375b9","Type":"ContainerStarted","Data":"5d7a67c4e49bf600f1f0d5a03a10fc406738f91af58ec1aa64267f3aea071d37"} Feb 17 17:50:39 crc kubenswrapper[4762]: I0217 17:50:39.396992 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zfghh" podStartSLOduration=3.377687674 podStartE2EDuration="42.396966592s" podCreationTimestamp="2026-02-17 17:49:57 +0000 UTC" firstStartedPulling="2026-02-17 17:49:58.70472146 +0000 UTC m=+150.349639480" lastFinishedPulling="2026-02-17 17:50:37.724000368 +0000 UTC m=+189.368918398" observedRunningTime="2026-02-17 17:50:39.393376025 +0000 UTC m=+191.038294055" watchObservedRunningTime="2026-02-17 17:50:39.396966592 +0000 UTC m=+191.041884622" Feb 17 17:50:40 crc kubenswrapper[4762]: I0217 17:50:40.386481 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fmbwb" event={"ID":"1ff10a6d-758d-44f1-bc36-f2843c20401c","Type":"ContainerStarted","Data":"c5cbaae27108ad4ba815a202093e3ad495655b0875b89ed1c31598e8ab418dee"} Feb 17 17:50:40 crc kubenswrapper[4762]: I0217 17:50:40.388445 4762 generic.go:334] "Generic (PLEG): container finished" podID="5ec359e3-da36-4417-ad0f-be50ec2375b9" containerID="322897493a473c88b9af6a4a12b81d8813e3b56417c98bad5bf2ee64799df301" exitCode=0 Feb 17 17:50:40 crc kubenswrapper[4762]: I0217 17:50:40.388481 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5ec359e3-da36-4417-ad0f-be50ec2375b9","Type":"ContainerDied","Data":"322897493a473c88b9af6a4a12b81d8813e3b56417c98bad5bf2ee64799df301"} Feb 17 17:50:41 crc kubenswrapper[4762]: I0217 17:50:41.427994 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fmbwb" podStartSLOduration=3.809715702 podStartE2EDuration="42.427967433s" podCreationTimestamp="2026-02-17 17:49:59 +0000 UTC" firstStartedPulling="2026-02-17 17:50:00.845237762 +0000 UTC m=+152.490155772" lastFinishedPulling="2026-02-17 17:50:39.463489493 +0000 UTC m=+191.108407503" observedRunningTime="2026-02-17 17:50:41.422351805 +0000 UTC m=+193.067269815" watchObservedRunningTime="2026-02-17 17:50:41.427967433 +0000 UTC m=+193.072885463" Feb 17 17:50:41 crc kubenswrapper[4762]: I0217 17:50:41.824017 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:41 crc kubenswrapper[4762]: I0217 17:50:41.986244 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec359e3-da36-4417-ad0f-be50ec2375b9-kube-api-access\") pod \"5ec359e3-da36-4417-ad0f-be50ec2375b9\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " Feb 17 17:50:41 crc kubenswrapper[4762]: I0217 17:50:41.986544 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ec359e3-da36-4417-ad0f-be50ec2375b9-kubelet-dir\") pod \"5ec359e3-da36-4417-ad0f-be50ec2375b9\" (UID: \"5ec359e3-da36-4417-ad0f-be50ec2375b9\") " Feb 17 17:50:41 crc kubenswrapper[4762]: I0217 17:50:41.986800 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec359e3-da36-4417-ad0f-be50ec2375b9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5ec359e3-da36-4417-ad0f-be50ec2375b9" (UID: "5ec359e3-da36-4417-ad0f-be50ec2375b9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:50:41 crc kubenswrapper[4762]: I0217 17:50:41.992934 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec359e3-da36-4417-ad0f-be50ec2375b9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5ec359e3-da36-4417-ad0f-be50ec2375b9" (UID: "5ec359e3-da36-4417-ad0f-be50ec2375b9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:50:42 crc kubenswrapper[4762]: I0217 17:50:42.087696 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec359e3-da36-4417-ad0f-be50ec2375b9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:42 crc kubenswrapper[4762]: I0217 17:50:42.087730 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ec359e3-da36-4417-ad0f-be50ec2375b9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:42 crc kubenswrapper[4762]: I0217 17:50:42.406046 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5ec359e3-da36-4417-ad0f-be50ec2375b9","Type":"ContainerDied","Data":"5d7a67c4e49bf600f1f0d5a03a10fc406738f91af58ec1aa64267f3aea071d37"} Feb 17 17:50:42 crc kubenswrapper[4762]: I0217 17:50:42.406087 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:50:42 crc kubenswrapper[4762]: I0217 17:50:42.406092 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d7a67c4e49bf600f1f0d5a03a10fc406738f91af58ec1aa64267f3aea071d37" Feb 17 17:50:43 crc kubenswrapper[4762]: I0217 17:50:43.417227 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4hgg" event={"ID":"49fb31e9-5d77-487b-bcdb-647dafb291fb","Type":"ContainerStarted","Data":"b286020766826e60062e0258bc83f8fa82c0007f96df249d737245497cb93b55"} Feb 17 17:50:43 crc kubenswrapper[4762]: I0217 17:50:43.421591 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerStarted","Data":"9dcdd00effdadf1d349ce93fb8d2a98e3aa4612aeb337b6f07bdd3f76796e97f"} Feb 17 17:50:43 crc kubenswrapper[4762]: I0217 17:50:43.425429 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerStarted","Data":"4a17eb1bbde5204fb8d646d0ccbb260444c437077729dcb94f7fb63dcf808e70"} Feb 17 17:50:43 crc kubenswrapper[4762]: I0217 17:50:43.434495 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4hgg" podStartSLOduration=3.451824665 podStartE2EDuration="44.434473606s" podCreationTimestamp="2026-02-17 17:49:59 +0000 UTC" firstStartedPulling="2026-02-17 17:50:00.830879183 +0000 UTC m=+152.475797183" lastFinishedPulling="2026-02-17 17:50:41.813528114 +0000 UTC m=+193.458446124" observedRunningTime="2026-02-17 17:50:43.433998922 +0000 UTC m=+195.078916932" watchObservedRunningTime="2026-02-17 17:50:43.434473606 +0000 UTC m=+195.079391616" Feb 17 17:50:43 crc kubenswrapper[4762]: I0217 17:50:43.454022 4762 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6m7l2" podStartSLOduration=3.5515966690000003 podStartE2EDuration="43.45398107s" podCreationTimestamp="2026-02-17 17:50:00 +0000 UTC" firstStartedPulling="2026-02-17 17:50:01.853996854 +0000 UTC m=+153.498914874" lastFinishedPulling="2026-02-17 17:50:41.756381265 +0000 UTC m=+193.401299275" observedRunningTime="2026-02-17 17:50:43.450671621 +0000 UTC m=+195.095589631" watchObservedRunningTime="2026-02-17 17:50:43.45398107 +0000 UTC m=+195.098899080" Feb 17 17:50:43 crc kubenswrapper[4762]: I0217 17:50:43.468063 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jq9qr" podStartSLOduration=3.5702790970000002 podStartE2EDuration="43.46804292s" podCreationTimestamp="2026-02-17 17:50:00 +0000 UTC" firstStartedPulling="2026-02-17 17:50:01.85920876 +0000 UTC m=+153.504126770" lastFinishedPulling="2026-02-17 17:50:41.756972583 +0000 UTC m=+193.401890593" observedRunningTime="2026-02-17 17:50:43.464465803 +0000 UTC m=+195.109383823" watchObservedRunningTime="2026-02-17 17:50:43.46804292 +0000 UTC m=+195.112960930" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.450600 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 17:50:45 crc kubenswrapper[4762]: E0217 17:50:45.452713 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec359e3-da36-4417-ad0f-be50ec2375b9" containerName="pruner" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.452837 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec359e3-da36-4417-ad0f-be50ec2375b9" containerName="pruner" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.453066 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec359e3-da36-4417-ad0f-be50ec2375b9" containerName="pruner" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.453613 4762 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.456672 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.457816 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.457891 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.632609 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-var-lock\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.632728 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kubelet-dir\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.632771 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kube-api-access\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.733701 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-var-lock\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.733809 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kubelet-dir\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.733844 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kube-api-access\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.734227 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-var-lock\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.734261 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kubelet-dir\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.754365 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kube-api-access\") pod \"installer-9-crc\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:45 crc kubenswrapper[4762]: I0217 17:50:45.774138 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:50:46 crc kubenswrapper[4762]: I0217 17:50:46.238467 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 17:50:46 crc kubenswrapper[4762]: I0217 17:50:46.441228 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"28c4e4c3-1636-4e35-bd78-c3139a2fb077","Type":"ContainerStarted","Data":"dea8702919dea2e0b89e6089ba00b080dfa431ae45118eac0a7f8063972030e5"} Feb 17 17:50:47 crc kubenswrapper[4762]: I0217 17:50:47.451935 4762 generic.go:334] "Generic (PLEG): container finished" podID="55b54250-be7b-4b98-9716-68be885af4d1" containerID="75c88f020437e0a85bd3d03ad53ae5cf41072d7ea281f171e83b7554f71b1103" exitCode=0 Feb 17 17:50:47 crc kubenswrapper[4762]: I0217 17:50:47.452049 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9s9f7" event={"ID":"55b54250-be7b-4b98-9716-68be885af4d1","Type":"ContainerDied","Data":"75c88f020437e0a85bd3d03ad53ae5cf41072d7ea281f171e83b7554f71b1103"} Feb 17 17:50:47 crc kubenswrapper[4762]: I0217 17:50:47.455854 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"28c4e4c3-1636-4e35-bd78-c3139a2fb077","Type":"ContainerStarted","Data":"e11ac2cfcca599ce823eb3c81adfe666090e58be4c7a6616cd887d963e5784de"} Feb 17 17:50:47 crc kubenswrapper[4762]: I0217 17:50:47.576988 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:50:47 crc kubenswrapper[4762]: I0217 17:50:47.577081 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 
17:50:48 crc kubenswrapper[4762]: I0217 17:50:48.075289 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:50:48 crc kubenswrapper[4762]: I0217 17:50:48.094715 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.094694823 podStartE2EDuration="3.094694823s" podCreationTimestamp="2026-02-17 17:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:50:47.501174761 +0000 UTC m=+199.146092791" watchObservedRunningTime="2026-02-17 17:50:48.094694823 +0000 UTC m=+199.739612843" Feb 17 17:50:48 crc kubenswrapper[4762]: I0217 17:50:48.500045 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.355007 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.355430 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.439265 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.509300 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.750855 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.750926 4762 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:50:49 crc kubenswrapper[4762]: I0217 17:50:49.921510 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:50:50 crc kubenswrapper[4762]: I0217 17:50:50.530685 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:50:50 crc kubenswrapper[4762]: I0217 17:50:50.574285 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:50 crc kubenswrapper[4762]: I0217 17:50:50.574658 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:50 crc kubenswrapper[4762]: I0217 17:50:50.617974 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.064223 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.064498 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.103910 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.484001 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9s9f7" event={"ID":"55b54250-be7b-4b98-9716-68be885af4d1","Type":"ContainerStarted","Data":"d301fc6a0923217a96c91e92185b72c20d6a78c32c45f5bc1a9eda6ecc15d3fb"} Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.506255 4762 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9s9f7" podStartSLOduration=2.817907496 podStartE2EDuration="54.506219171s" podCreationTimestamp="2026-02-17 17:49:57 +0000 UTC" firstStartedPulling="2026-02-17 17:49:58.697176624 +0000 UTC m=+150.342094634" lastFinishedPulling="2026-02-17 17:50:50.385488299 +0000 UTC m=+202.030406309" observedRunningTime="2026-02-17 17:50:51.503661394 +0000 UTC m=+203.148579424" watchObservedRunningTime="2026-02-17 17:50:51.506219171 +0000 UTC m=+203.151137181" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.522884 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.523248 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:50:51 crc kubenswrapper[4762]: I0217 17:50:51.858304 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4hgg"] Feb 17 17:50:52 crc kubenswrapper[4762]: I0217 17:50:52.488491 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m4hgg" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="registry-server" containerID="cri-o://b286020766826e60062e0258bc83f8fa82c0007f96df249d737245497cb93b55" gracePeriod=2 Feb 17 17:50:53 crc kubenswrapper[4762]: I0217 17:50:53.657574 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6m7l2"] Feb 17 17:50:54 crc kubenswrapper[4762]: I0217 17:50:54.503158 4762 generic.go:334] "Generic (PLEG): container finished" podID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerID="b286020766826e60062e0258bc83f8fa82c0007f96df249d737245497cb93b55" exitCode=0 Feb 17 17:50:54 crc kubenswrapper[4762]: I0217 17:50:54.503256 4762 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4hgg" event={"ID":"49fb31e9-5d77-487b-bcdb-647dafb291fb","Type":"ContainerDied","Data":"b286020766826e60062e0258bc83f8fa82c0007f96df249d737245497cb93b55"} Feb 17 17:50:54 crc kubenswrapper[4762]: I0217 17:50:54.503699 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6m7l2" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="registry-server" containerID="cri-o://4a17eb1bbde5204fb8d646d0ccbb260444c437077729dcb94f7fb63dcf808e70" gracePeriod=2 Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.511797 4762 generic.go:334] "Generic (PLEG): container finished" podID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerID="4a17eb1bbde5204fb8d646d0ccbb260444c437077729dcb94f7fb63dcf808e70" exitCode=0 Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.512055 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerDied","Data":"4a17eb1bbde5204fb8d646d0ccbb260444c437077729dcb94f7fb63dcf808e70"} Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.512078 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m7l2" event={"ID":"9c4c88c9-be13-4f25-8975-d09ad5affc6f","Type":"ContainerDied","Data":"a2187bdc62dac94dc5b7fc795e31ccc5af2a4fde564d9e2a4eac97ff96cbf547"} Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.512090 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2187bdc62dac94dc5b7fc795e31ccc5af2a4fde564d9e2a4eac97ff96cbf547" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.512671 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.514922 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4hgg" event={"ID":"49fb31e9-5d77-487b-bcdb-647dafb291fb","Type":"ContainerDied","Data":"6144c1b3e1531171c9912a25e1a4c26fc89b4419132062cffd0ddbcb80ea1a03"} Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.514956 4762 scope.go:117] "RemoveContainer" containerID="b286020766826e60062e0258bc83f8fa82c0007f96df249d737245497cb93b55" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.520211 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.534705 4762 scope.go:117] "RemoveContainer" containerID="93345c0d8d8a7b37c9868176c969f16bfb9692f09119dd11c8c3bcd3dcd0c1a6" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.551350 4762 scope.go:117] "RemoveContainer" containerID="15abfcc28c6d31a0e6346b99a6b6d077d8b471b2695f169e9723f2603416a423" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.691070 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-catalog-content\") pod \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.691114 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-utilities\") pod \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.691142 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4xhv9\" (UniqueName: \"kubernetes.io/projected/9c4c88c9-be13-4f25-8975-d09ad5affc6f-kube-api-access-4xhv9\") pod \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\" (UID: \"9c4c88c9-be13-4f25-8975-d09ad5affc6f\") " Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.691181 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgw24\" (UniqueName: \"kubernetes.io/projected/49fb31e9-5d77-487b-bcdb-647dafb291fb-kube-api-access-rgw24\") pod \"49fb31e9-5d77-487b-bcdb-647dafb291fb\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.691210 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-catalog-content\") pod \"49fb31e9-5d77-487b-bcdb-647dafb291fb\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.691253 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-utilities\") pod \"49fb31e9-5d77-487b-bcdb-647dafb291fb\" (UID: \"49fb31e9-5d77-487b-bcdb-647dafb291fb\") " Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.692407 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-utilities" (OuterVolumeSpecName: "utilities") pod "9c4c88c9-be13-4f25-8975-d09ad5affc6f" (UID: "9c4c88c9-be13-4f25-8975-d09ad5affc6f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.697221 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fb31e9-5d77-487b-bcdb-647dafb291fb-kube-api-access-rgw24" (OuterVolumeSpecName: "kube-api-access-rgw24") pod "49fb31e9-5d77-487b-bcdb-647dafb291fb" (UID: "49fb31e9-5d77-487b-bcdb-647dafb291fb"). InnerVolumeSpecName "kube-api-access-rgw24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.697431 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4c88c9-be13-4f25-8975-d09ad5affc6f-kube-api-access-4xhv9" (OuterVolumeSpecName: "kube-api-access-4xhv9") pod "9c4c88c9-be13-4f25-8975-d09ad5affc6f" (UID: "9c4c88c9-be13-4f25-8975-d09ad5affc6f"). InnerVolumeSpecName "kube-api-access-4xhv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.699514 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-utilities" (OuterVolumeSpecName: "utilities") pod "49fb31e9-5d77-487b-bcdb-647dafb291fb" (UID: "49fb31e9-5d77-487b-bcdb-647dafb291fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.722099 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49fb31e9-5d77-487b-bcdb-647dafb291fb" (UID: "49fb31e9-5d77-487b-bcdb-647dafb291fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.792245 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.792283 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xhv9\" (UniqueName: \"kubernetes.io/projected/9c4c88c9-be13-4f25-8975-d09ad5affc6f-kube-api-access-4xhv9\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.792297 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgw24\" (UniqueName: \"kubernetes.io/projected/49fb31e9-5d77-487b-bcdb-647dafb291fb-kube-api-access-rgw24\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.792307 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.792318 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fb31e9-5d77-487b-bcdb-647dafb291fb-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.853095 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c4c88c9-be13-4f25-8975-d09ad5affc6f" (UID: "9c4c88c9-be13-4f25-8975-d09ad5affc6f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:50:55 crc kubenswrapper[4762]: I0217 17:50:55.893795 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4c88c9-be13-4f25-8975-d09ad5affc6f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.524723 4762 generic.go:334] "Generic (PLEG): container finished" podID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerID="befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d" exitCode=0 Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.524837 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb99z" event={"ID":"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484","Type":"ContainerDied","Data":"befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d"} Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.527837 4762 generic.go:334] "Generic (PLEG): container finished" podID="a0697342-ade9-480a-9ac9-074416d620ef" containerID="93b51f809932e401efbccc816311187c7bb2de250bd0d63898e4a7fbcc395ad9" exitCode=0 Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.527944 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnlvk" event={"ID":"a0697342-ade9-480a-9ac9-074416d620ef","Type":"ContainerDied","Data":"93b51f809932e401efbccc816311187c7bb2de250bd0d63898e4a7fbcc395ad9"} Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.529677 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4hgg" Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.529720 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6m7l2" Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.579785 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4hgg"] Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.583815 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4hgg"] Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.588678 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6m7l2"] Feb 17 17:50:56 crc kubenswrapper[4762]: I0217 17:50:56.593544 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6m7l2"] Feb 17 17:50:57 crc kubenswrapper[4762]: I0217 17:50:57.044480 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" path="/var/lib/kubelet/pods/49fb31e9-5d77-487b-bcdb-647dafb291fb/volumes" Feb 17 17:50:57 crc kubenswrapper[4762]: I0217 17:50:57.045253 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" path="/var/lib/kubelet/pods/9c4c88c9-be13-4f25-8975-d09ad5affc6f/volumes" Feb 17 17:50:57 crc kubenswrapper[4762]: I0217 17:50:57.536653 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnlvk" event={"ID":"a0697342-ade9-480a-9ac9-074416d620ef","Type":"ContainerStarted","Data":"ebf9a27c2db6c94ac0f551cb66113404401f158ea4c94a05c31955b9bed29539"} Feb 17 17:50:57 crc kubenswrapper[4762]: I0217 17:50:57.953655 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:50:57 crc kubenswrapper[4762]: I0217 17:50:57.953713 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:50:57 crc 
kubenswrapper[4762]: I0217 17:50:57.991991 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:50:58 crc kubenswrapper[4762]: I0217 17:50:58.543953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb99z" event={"ID":"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484","Type":"ContainerStarted","Data":"666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04"} Feb 17 17:50:58 crc kubenswrapper[4762]: I0217 17:50:58.559416 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jb99z" podStartSLOduration=2.597297558 podStartE2EDuration="1m1.55940049s" podCreationTimestamp="2026-02-17 17:49:57 +0000 UTC" firstStartedPulling="2026-02-17 17:49:58.706784792 +0000 UTC m=+150.351702802" lastFinishedPulling="2026-02-17 17:50:57.668887724 +0000 UTC m=+209.313805734" observedRunningTime="2026-02-17 17:50:58.558364219 +0000 UTC m=+210.203282239" watchObservedRunningTime="2026-02-17 17:50:58.55940049 +0000 UTC m=+210.204318490" Feb 17 17:50:58 crc kubenswrapper[4762]: I0217 17:50:58.581492 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jnlvk" podStartSLOduration=3.020181156 podStartE2EDuration="1m1.58147015s" podCreationTimestamp="2026-02-17 17:49:57 +0000 UTC" firstStartedPulling="2026-02-17 17:49:58.700824274 +0000 UTC m=+150.345742284" lastFinishedPulling="2026-02-17 17:50:57.262113268 +0000 UTC m=+208.907031278" observedRunningTime="2026-02-17 17:50:58.577574013 +0000 UTC m=+210.222492033" watchObservedRunningTime="2026-02-17 17:50:58.58147015 +0000 UTC m=+210.226388160" Feb 17 17:50:58 crc kubenswrapper[4762]: I0217 17:50:58.586250 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:51:00 crc kubenswrapper[4762]: I0217 
17:51:00.258401 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9s9f7"] Feb 17 17:51:00 crc kubenswrapper[4762]: I0217 17:51:00.551832 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9s9f7" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="registry-server" containerID="cri-o://d301fc6a0923217a96c91e92185b72c20d6a78c32c45f5bc1a9eda6ecc15d3fb" gracePeriod=2 Feb 17 17:51:02 crc kubenswrapper[4762]: I0217 17:51:02.563385 4762 generic.go:334] "Generic (PLEG): container finished" podID="55b54250-be7b-4b98-9716-68be885af4d1" containerID="d301fc6a0923217a96c91e92185b72c20d6a78c32c45f5bc1a9eda6ecc15d3fb" exitCode=0 Feb 17 17:51:02 crc kubenswrapper[4762]: I0217 17:51:02.563453 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9s9f7" event={"ID":"55b54250-be7b-4b98-9716-68be885af4d1","Type":"ContainerDied","Data":"d301fc6a0923217a96c91e92185b72c20d6a78c32c45f5bc1a9eda6ecc15d3fb"} Feb 17 17:51:02 crc kubenswrapper[4762]: I0217 17:51:02.986438 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.088156 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb7fz\" (UniqueName: \"kubernetes.io/projected/55b54250-be7b-4b98-9716-68be885af4d1-kube-api-access-fb7fz\") pod \"55b54250-be7b-4b98-9716-68be885af4d1\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.088403 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-catalog-content\") pod \"55b54250-be7b-4b98-9716-68be885af4d1\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.088531 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-utilities\") pod \"55b54250-be7b-4b98-9716-68be885af4d1\" (UID: \"55b54250-be7b-4b98-9716-68be885af4d1\") " Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.089695 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-utilities" (OuterVolumeSpecName: "utilities") pod "55b54250-be7b-4b98-9716-68be885af4d1" (UID: "55b54250-be7b-4b98-9716-68be885af4d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.094592 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b54250-be7b-4b98-9716-68be885af4d1-kube-api-access-fb7fz" (OuterVolumeSpecName: "kube-api-access-fb7fz") pod "55b54250-be7b-4b98-9716-68be885af4d1" (UID: "55b54250-be7b-4b98-9716-68be885af4d1"). InnerVolumeSpecName "kube-api-access-fb7fz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.141747 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55b54250-be7b-4b98-9716-68be885af4d1" (UID: "55b54250-be7b-4b98-9716-68be885af4d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.190133 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb7fz\" (UniqueName: \"kubernetes.io/projected/55b54250-be7b-4b98-9716-68be885af4d1-kube-api-access-fb7fz\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.190192 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.190206 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b54250-be7b-4b98-9716-68be885af4d1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.571756 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9s9f7" event={"ID":"55b54250-be7b-4b98-9716-68be885af4d1","Type":"ContainerDied","Data":"e862dcf35f4fd7ee664d2c9311473b014deca2ddebbbe28f5309ac8b4217fda5"} Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.572753 4762 scope.go:117] "RemoveContainer" containerID="d301fc6a0923217a96c91e92185b72c20d6a78c32c45f5bc1a9eda6ecc15d3fb" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.572385 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9s9f7" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.590975 4762 scope.go:117] "RemoveContainer" containerID="75c88f020437e0a85bd3d03ad53ae5cf41072d7ea281f171e83b7554f71b1103" Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.596078 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9s9f7"] Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.609744 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9s9f7"] Feb 17 17:51:03 crc kubenswrapper[4762]: I0217 17:51:03.631413 4762 scope.go:117] "RemoveContainer" containerID="98a3b98b91d9fb457bd604a173f6f9b366876637c0f8ec3f1e092a2f558e9b59" Feb 17 17:51:04 crc kubenswrapper[4762]: I0217 17:51:04.558751 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:51:04 crc kubenswrapper[4762]: I0217 17:51:04.559202 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:51:04 crc kubenswrapper[4762]: I0217 17:51:04.559343 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:51:04 crc kubenswrapper[4762]: I0217 17:51:04.559953 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:51:04 crc kubenswrapper[4762]: I0217 17:51:04.560151 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" containerID="cri-o://b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba" gracePeriod=600 Feb 17 17:51:05 crc kubenswrapper[4762]: I0217 17:51:05.046060 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b54250-be7b-4b98-9716-68be885af4d1" path="/var/lib/kubelet/pods/55b54250-be7b-4b98-9716-68be885af4d1/volumes" Feb 17 17:51:05 crc kubenswrapper[4762]: I0217 17:51:05.586339 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba" exitCode=0 Feb 17 17:51:05 crc kubenswrapper[4762]: I0217 17:51:05.586428 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba"} Feb 17 17:51:05 crc kubenswrapper[4762]: I0217 17:51:05.586709 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"e4fa34e3eae7dd4023f4a8dcdfb848ad377b3ac4763f97bb9696cc12d23a4871"} Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.438920 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.439417 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.477532 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.642203 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.741226 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.741279 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:51:07 crc kubenswrapper[4762]: I0217 17:51:07.785159 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:51:08 crc kubenswrapper[4762]: I0217 17:51:08.642320 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:51:08 crc kubenswrapper[4762]: I0217 17:51:08.726931 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9bp9t"] Feb 17 17:51:09 crc kubenswrapper[4762]: I0217 17:51:09.457030 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jb99z"] Feb 17 17:51:10 crc kubenswrapper[4762]: I0217 17:51:10.615777 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jb99z" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" 
containerName="registry-server" containerID="cri-o://666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04" gracePeriod=2 Feb 17 17:51:10 crc kubenswrapper[4762]: I0217 17:51:10.996222 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.191939 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mszf7\" (UniqueName: \"kubernetes.io/projected/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-kube-api-access-mszf7\") pod \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.192005 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-catalog-content\") pod \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.192063 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-utilities\") pod \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\" (UID: \"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484\") " Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.192903 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-utilities" (OuterVolumeSpecName: "utilities") pod "68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" (UID: "68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.197366 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-kube-api-access-mszf7" (OuterVolumeSpecName: "kube-api-access-mszf7") pod "68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" (UID: "68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484"). InnerVolumeSpecName "kube-api-access-mszf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.258316 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" (UID: "68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.296122 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.296457 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.296478 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mszf7\" (UniqueName: \"kubernetes.io/projected/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484-kube-api-access-mszf7\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.622070 4762 generic.go:334] "Generic (PLEG): container finished" podID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" 
containerID="666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04" exitCode=0 Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.622114 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb99z" event={"ID":"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484","Type":"ContainerDied","Data":"666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04"} Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.622139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb99z" event={"ID":"68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484","Type":"ContainerDied","Data":"23092a08fa6c1bfe6d3beea536190accd6c380f031affaa0433092a58e7940d4"} Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.622154 4762 scope.go:117] "RemoveContainer" containerID="666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.622294 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jb99z" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.643804 4762 scope.go:117] "RemoveContainer" containerID="befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.650281 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jb99z"] Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.670067 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jb99z"] Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.675514 4762 scope.go:117] "RemoveContainer" containerID="fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.693158 4762 scope.go:117] "RemoveContainer" containerID="666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04" Feb 17 17:51:11 crc kubenswrapper[4762]: E0217 17:51:11.693707 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04\": container with ID starting with 666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04 not found: ID does not exist" containerID="666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.693763 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04"} err="failed to get container status \"666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04\": rpc error: code = NotFound desc = could not find container \"666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04\": container with ID starting with 666427862d1f7f24562fb01c032372268b9dd1851a9875c3d6818fa1cbce6a04 not 
found: ID does not exist" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.693798 4762 scope.go:117] "RemoveContainer" containerID="befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d" Feb 17 17:51:11 crc kubenswrapper[4762]: E0217 17:51:11.694774 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d\": container with ID starting with befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d not found: ID does not exist" containerID="befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.694809 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d"} err="failed to get container status \"befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d\": rpc error: code = NotFound desc = could not find container \"befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d\": container with ID starting with befd3940245b7f5bce51492086cecba058e7c3d15cc0b20d3f4ecb72002c295d not found: ID does not exist" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.694831 4762 scope.go:117] "RemoveContainer" containerID="fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4" Feb 17 17:51:11 crc kubenswrapper[4762]: E0217 17:51:11.695159 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4\": container with ID starting with fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4 not found: ID does not exist" containerID="fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4" Feb 17 17:51:11 crc kubenswrapper[4762]: I0217 17:51:11.695214 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4"} err="failed to get container status \"fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4\": rpc error: code = NotFound desc = could not find container \"fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4\": container with ID starting with fe50d4e8b8a51cb777fdde6110e351de07e3dee68b467853282f5f40fbbf1eb4 not found: ID does not exist" Feb 17 17:51:13 crc kubenswrapper[4762]: I0217 17:51:13.041515 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" path="/var/lib/kubelet/pods/68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484/volumes" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.096960 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097800 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097813 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097825 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097834 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097842 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: 
I0217 17:51:24.097848 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097858 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097863 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097872 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097877 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097889 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097895 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097903 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097909 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097918 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 
17:51:24.097923 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="extract-content" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097932 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097939 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097947 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097952 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097962 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097970 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.097977 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.097982 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="extract-utilities" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.098075 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fb31e9-5d77-487b-bcdb-647dafb291fb" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 
17:51:24.098092 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b54250-be7b-4b98-9716-68be885af4d1" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.098108 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e9fa9a-c0d2-4e9d-b77b-b3bb98b88484" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.098117 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4c88c9-be13-4f25-8975-d09ad5affc6f" containerName="registry-server" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.098541 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.140912 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.162951 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163046 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163342 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163362 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163372 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163380 4762 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163392 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163399 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163410 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163419 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163433 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163442 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163454 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.163461 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.163475 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 17:51:24 crc 
kubenswrapper[4762]: I0217 17:51:24.163482 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.164040 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.164055 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.164071 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.164079 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.164088 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.164097 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.165548 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498" gracePeriod=15 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.165685 4762 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72" gracePeriod=15 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.165723 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6" gracePeriod=15 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.165798 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba" gracePeriod=15 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.165674 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff" gracePeriod=15 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.165961 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.166072 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.166127 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.166213 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.166317 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.267149 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.267430 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.267511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.269840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.270059 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.270386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.270415 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.271542 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.271579 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.271668 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.271701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.271718 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.271798 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.372794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.372929 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.373258 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.373344 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.373560 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.373648 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.439081 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:51:24 crc kubenswrapper[4762]: W0217 17:51:24.471311 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-3f33d621a43de1c4781e5baae31457137d0eb3137aa3033c18143f4def30c997 WatchSource:0}: Error finding container 3f33d621a43de1c4781e5baae31457137d0eb3137aa3033c18143f4def30c997: Status 404 returned error can't find the container with id 3f33d621a43de1c4781e5baae31457137d0eb3137aa3033c18143f4def30c997 Feb 17 17:51:24 crc kubenswrapper[4762]: E0217 17:51:24.475316 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18951a1000bf5b8c openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:51:24.474497932 +0000 UTC m=+236.119415952,LastTimestamp:2026-02-17 17:51:24.474497932 +0000 UTC m=+236.119415952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.694305 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.695739 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.696562 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff" exitCode=0 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.696595 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba" exitCode=0 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.696609 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72" exitCode=0 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.696642 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6" exitCode=2 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.696694 4762 scope.go:117] "RemoveContainer" containerID="8ad8c27a1ae901caa7797d6530eeb085580a681f5e8a1d8e6fc07df4a8e95aa0" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.702309 4762 generic.go:334] "Generic (PLEG): container finished" podID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" containerID="e11ac2cfcca599ce823eb3c81adfe666090e58be4c7a6616cd887d963e5784de" exitCode=0 Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.702369 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"28c4e4c3-1636-4e35-bd78-c3139a2fb077","Type":"ContainerDied","Data":"e11ac2cfcca599ce823eb3c81adfe666090e58be4c7a6616cd887d963e5784de"} Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.703769 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.704392 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.704926 4762 status_manager.go:851] "Failed to 
get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:24 crc kubenswrapper[4762]: I0217 17:51:24.705781 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3f33d621a43de1c4781e5baae31457137d0eb3137aa3033c18143f4def30c997"} Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.712669 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cd3dc69ae9bea17fcca6cdada676ea1fec275c9bb7d4edc03fa33e73d1a77f6c"} Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.713595 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.714131 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.715931 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 
17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.935724 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.936353 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.936883 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.996867 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kubelet-dir\") pod \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.996938 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kube-api-access\") pod \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\" (UID: \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.996994 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-var-lock\") pod \"28c4e4c3-1636-4e35-bd78-c3139a2fb077\" (UID: 
\"28c4e4c3-1636-4e35-bd78-c3139a2fb077\") " Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.997025 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "28c4e4c3-1636-4e35-bd78-c3139a2fb077" (UID: "28c4e4c3-1636-4e35-bd78-c3139a2fb077"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.997186 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:25 crc kubenswrapper[4762]: I0217 17:51:25.997184 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-var-lock" (OuterVolumeSpecName: "var-lock") pod "28c4e4c3-1636-4e35-bd78-c3139a2fb077" (UID: "28c4e4c3-1636-4e35-bd78-c3139a2fb077"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.003881 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "28c4e4c3-1636-4e35-bd78-c3139a2fb077" (UID: "28c4e4c3-1636-4e35-bd78-c3139a2fb077"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.098495 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28c4e4c3-1636-4e35-bd78-c3139a2fb077-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.098536 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/28c4e4c3-1636-4e35-bd78-c3139a2fb077-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.723293 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"28c4e4c3-1636-4e35-bd78-c3139a2fb077","Type":"ContainerDied","Data":"dea8702919dea2e0b89e6089ba00b080dfa431ae45118eac0a7f8063972030e5"} Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.723540 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dea8702919dea2e0b89e6089ba00b080dfa431ae45118eac0a7f8063972030e5" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.723351 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.726729 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.727912 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498" exitCode=0 Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.745850 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:26 crc kubenswrapper[4762]: I0217 17:51:26.746357 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.064972 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.065778 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.066349 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.066606 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.066867 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250097 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250200 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250243 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250300 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250317 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250419 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250868 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250898 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.250910 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.744123 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.746107 4762 scope.go:117] "RemoveContainer" containerID="da0d7ce2f7ef68247eab63dec449aae3616235a80e23d5a9e7457f053116c5ff" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.746252 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.766980 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.767289 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.767541 4762 scope.go:117] "RemoveContainer" containerID="3b84e4fb530cef6f171257dc7bc6b577dbcb63437fbadc95afb009fb41c8f3ba" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.767612 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.780912 4762 scope.go:117] "RemoveContainer" containerID="62b137ac973716892fe1327d6ffa886ab0d641cb8ccc7079c7a534bb1276bc72" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.795978 4762 scope.go:117] "RemoveContainer" containerID="08ce2d5f26af67c2ed8d01520a08c527eb332fdf3f59104be8d375cda3f61ae6" Feb 17 17:51:27 crc kubenswrapper[4762]: I0217 17:51:27.810846 4762 scope.go:117] "RemoveContainer" containerID="3c4e77d01aaf541e0f0d038ef15fbe5635e77bbc1f8f8b67ad407cd0072c3498" Feb 17 17:51:27 crc 
kubenswrapper[4762]: I0217 17:51:27.823348 4762 scope.go:117] "RemoveContainer" containerID="4729f7173dcee9cce0f25304de6b4c4a49e1ede200a2a7585ab52c3b8fcc6163" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.694429 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.695306 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.695702 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.696051 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.696338 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: I0217 17:51:28.696368 4762 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.696585 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.897762 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.925279 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:51:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:51:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:51:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:51:28Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.925489 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.925687 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.925979 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.926205 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:28 crc kubenswrapper[4762]: E0217 17:51:28.926225 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:51:29 crc kubenswrapper[4762]: I0217 17:51:29.038979 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 17 17:51:29 crc kubenswrapper[4762]: I0217 17:51:29.039358 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" 
Feb 17 17:51:29 crc kubenswrapper[4762]: I0217 17:51:29.039801 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:29 crc kubenswrapper[4762]: I0217 17:51:29.045928 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 17 17:51:29 crc kubenswrapper[4762]: E0217 17:51:29.299161 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms"
Feb 17 17:51:30 crc kubenswrapper[4762]: E0217 17:51:30.100297 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s"
Feb 17 17:51:30 crc kubenswrapper[4762]: E0217 17:51:30.918154 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18951a1000bf5b8c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:51:24.474497932 +0000 UTC m=+236.119415952,LastTimestamp:2026-02-17 17:51:24.474497932 +0000 UTC m=+236.119415952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 17:51:31 crc kubenswrapper[4762]: E0217 17:51:31.701052 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s"
Feb 17 17:51:33 crc kubenswrapper[4762]: I0217 17:51:33.766066 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" containerName="oauth-openshift" containerID="cri-o://7ff0c4858cc2ca577111d16ac1ffb9274d3d7c6b641ecd3b631382229e2e109b" gracePeriod=15
Feb 17 17:51:33 crc kubenswrapper[4762]: I0217 17:51:33.946826 4762 generic.go:334] "Generic (PLEG): container finished" podID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" containerID="7ff0c4858cc2ca577111d16ac1ffb9274d3d7c6b641ecd3b631382229e2e109b" exitCode=0
Feb 17 17:51:33 crc kubenswrapper[4762]: I0217 17:51:33.946871 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" event={"ID":"35fb25d5-f8ca-43c5-ae4d-31da698c4780","Type":"ContainerDied","Data":"7ff0c4858cc2ca577111d16ac1ffb9274d3d7c6b641ecd3b631382229e2e109b"}
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.137911 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.139483 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.140282 4762 status_manager.go:851] "Failed to get status for pod" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9bp9t\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.140949 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243496 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-cliconfig\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243593 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpbph\" (UniqueName: \"kubernetes.io/projected/35fb25d5-f8ca-43c5-ae4d-31da698c4780-kube-api-access-kpbph\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243671 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-session\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243715 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-provider-selection\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243764 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-error\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243801 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-login\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243852 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-service-ca\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.243884 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-idp-0-file-data\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.244008 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-ocp-branding-template\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.244245 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-policies\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.244374 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-trusted-ca-bundle\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.244432 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-dir\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.244466 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-serving-cert\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.244509 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-router-certs\") pod \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\" (UID: \"35fb25d5-f8ca-43c5-ae4d-31da698c4780\") "
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.246188 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.246315 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.246521 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.246528 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.247019 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.250970 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fb25d5-f8ca-43c5-ae4d-31da698c4780-kube-api-access-kpbph" (OuterVolumeSpecName: "kube-api-access-kpbph") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "kube-api-access-kpbph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.251121 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.251302 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.251770 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.252062 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.252332 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.252776 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.255049 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.257107 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "35fb25d5-f8ca-43c5-ae4d-31da698c4780" (UID: "35fb25d5-f8ca-43c5-ae4d-31da698c4780"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345733 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345768 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345785 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345796 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345807 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345817 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35fb25d5-f8ca-43c5-ae4d-31da698c4780-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345825 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345834 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345844 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345853 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpbph\" (UniqueName: \"kubernetes.io/projected/35fb25d5-f8ca-43c5-ae4d-31da698c4780-kube-api-access-kpbph\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345863 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345871 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345882 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.345890 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35fb25d5-f8ca-43c5-ae4d-31da698c4780-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 17 17:51:34 crc kubenswrapper[4762]: E0217 17:51:34.902398 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="6.4s"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.953936 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" event={"ID":"35fb25d5-f8ca-43c5-ae4d-31da698c4780","Type":"ContainerDied","Data":"cc9851817ed4863190d0e316155f0a7e9041b513e10ef5c15e35ffeab066ea7e"}
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.954006 4762 scope.go:117] "RemoveContainer" containerID="7ff0c4858cc2ca577111d16ac1ffb9274d3d7c6b641ecd3b631382229e2e109b"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.954009 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.954727 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.955245 4762 status_manager.go:851] "Failed to get status for pod" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9bp9t\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.955548 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.972957 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.973396 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:34 crc kubenswrapper[4762]: I0217 17:51:34.973819 4762 status_manager.go:851] "Failed to get status for pod" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9bp9t\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.035560 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.036328 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.036875 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.037460 4762 status_manager.go:851] "Failed to get status for pod" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9bp9t\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.051386 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.051422 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:35 crc kubenswrapper[4762]: E0217 17:51:35.051918 4762 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.052365 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.961850 4762 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="94660780fab7a2782bcbe7aa0cb253f46f02cd6821926d5a36fdf0967d59ba88" exitCode=0
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.961952 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"94660780fab7a2782bcbe7aa0cb253f46f02cd6821926d5a36fdf0967d59ba88"}
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.962235 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d27d4ebb8ba13327024ce724518e1009095cb9cc4556a7fa628b60d0a0799638"}
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.962521 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.962534 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:35 crc kubenswrapper[4762]: E0217 17:51:35.962991 4762 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.963683 4762 status_manager.go:851] "Failed to get status for pod" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.964115 4762 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:35 crc kubenswrapper[4762]: I0217 17:51:35.964485 4762 status_manager.go:851] "Failed to get status for pod" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" pod="openshift-authentication/oauth-openshift-558db77b4-9bp9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9bp9t\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 17 17:51:36 crc kubenswrapper[4762]: I0217 17:51:36.974310 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf37cc7ca7d2ef7d085670d67114a6a68f6a5a42e4bccef28cc072ffffa1c567"}
Feb 17 17:51:36 crc kubenswrapper[4762]: I0217 17:51:36.976709 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"79a132c0480bcd70444619c26304c8af0b6605992115a378f0999776d5508dab"}
Feb 17 17:51:36 crc kubenswrapper[4762]: I0217 17:51:36.976734 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f7de0bbc88b71e7a4cf01c811b6c31625ed5913050ebabbf63ebbaf0dba939fc"}
Feb 17 17:51:36 crc kubenswrapper[4762]: I0217 17:51:36.976764 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2471ffee4ab6b24dd17eec9723843a5a660216a5b77bd9237882f80fe0a0dce0"}
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.982283 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.982340 4762 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0" exitCode=1
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.982408 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0"}
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.982882 4762 scope.go:117] "RemoveContainer" containerID="57a7373e6e59071fb9370194bc892392465235ccd1eb8f3487c92c3d6a6faea0"
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.986546 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c794e9a9b8fe158c11edb21a182b6f6cf449f08f9e7392c7c0037b79fc63822"}
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.986955 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.987071 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:37 crc kubenswrapper[4762]: I0217 17:51:37.987549 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:38 crc kubenswrapper[4762]: I0217 17:51:38.997145 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 17:51:38 crc kubenswrapper[4762]: I0217 17:51:38.997219 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2a2bb4a3ef1f3fb570da95a20c13a6fe8ca129a64ec2a3ae86f79abc44b8e114"}
Feb 17 17:51:40 crc kubenswrapper[4762]: I0217 17:51:40.052613 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:40 crc kubenswrapper[4762]: I0217 17:51:40.053275 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:40 crc kubenswrapper[4762]: I0217 17:51:40.062078 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:41 crc kubenswrapper[4762]: I0217 17:51:41.479930 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:51:43 crc kubenswrapper[4762]: I0217 17:51:43.136393 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:43 crc kubenswrapper[4762]: I0217 17:51:43.164716 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:43 crc kubenswrapper[4762]: I0217 17:51:43.164762 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:43 crc kubenswrapper[4762]: I0217 17:51:43.168907 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:51:43 crc kubenswrapper[4762]: I0217 17:51:43.236008 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="132664e6-05e9-4729-9b41-3fc4af85b912"
Feb 17 17:51:44 crc kubenswrapper[4762]: I0217 17:51:44.344490 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:44 crc kubenswrapper[4762]: I0217 17:51:44.344898 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9302ca52-ca46-4bc4-8c30-c436af0f9588"
Feb 17 17:51:44 crc kubenswrapper[4762]: I0217 17:51:44.347103 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="132664e6-05e9-4729-9b41-3fc4af85b912"
Feb 17 17:51:44 crc kubenswrapper[4762]: I0217 17:51:44.979963 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:51:44 crc kubenswrapper[4762]: I0217 17:51:44.984261 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:51:49 crc kubenswrapper[4762]: I0217 17:51:49.033012 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 17:51:49 crc kubenswrapper[4762]: I0217 17:51:49.534932 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 17:51:49 crc kubenswrapper[4762]: I0217 17:51:49.620593 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 17:51:49 crc kubenswrapper[4762]: I0217 17:51:49.891472 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 17:51:50 crc kubenswrapper[4762]: I0217 17:51:50.225000 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 17:51:50 crc kubenswrapper[4762]: I0217 17:51:50.542259 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 17:51:51 crc kubenswrapper[4762]: I0217 17:51:51.053719 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 17:51:51 crc kubenswrapper[4762]: I0217 17:51:51.127650 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 17:51:51 crc kubenswrapper[4762]: I0217 17:51:51.486560 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:51:51 crc kubenswrapper[4762]: I0217 17:51:51.768051 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 17:51:52 crc kubenswrapper[4762]: I0217 17:51:52.284026 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 17:51:52 crc kubenswrapper[4762]: I0217 17:51:52.479255 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 17:51:53 crc kubenswrapper[4762]: I0217 17:51:53.068072 4762 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 17:51:53 crc kubenswrapper[4762]: I0217 17:51:53.553342 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 17:51:53 crc kubenswrapper[4762]: I0217 17:51:53.625319 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 17:51:53 crc kubenswrapper[4762]: I0217 17:51:53.791511 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 17:51:54 crc kubenswrapper[4762]: I0217 17:51:54.009148 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 17:51:54 crc kubenswrapper[4762]: I0217 17:51:54.092947 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 17:51:54 crc kubenswrapper[4762]: I0217 17:51:54.449306 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 17:51:54 crc kubenswrapper[4762]: I0217 17:51:54.507750 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:51:54 crc kubenswrapper[4762]: I0217 17:51:54.516922 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 17:51:54 crc kubenswrapper[4762]: I0217 17:51:54.844519 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 17:51:55 crc kubenswrapper[4762]: I0217 17:51:55.137593 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 17:51:55 crc kubenswrapper[4762]: I0217 17:51:55.257347 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 17:51:55 crc kubenswrapper[4762]: I0217 17:51:55.869900 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 17:51:56 crc kubenswrapper[4762]: I0217 17:51:56.101539 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:51:56 crc kubenswrapper[4762]: I0217 17:51:56.239526 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 17:51:56 crc kubenswrapper[4762]: I0217 17:51:56.379045 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 17:51:56 crc kubenswrapper[4762]: I0217 17:51:56.593987 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 17:51:56 crc kubenswrapper[4762]: I0217 17:51:56.780680 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.166200 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.172422 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.249758 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.382346 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.419260 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.498462 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.524983 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.569734 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.670679 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.811929 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.824845 4762 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.874882 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 17:51:57 crc kubenswrapper[4762]: I0217 17:51:57.912224 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.020183 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.023265 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.130155 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.184310 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.225482 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.263766 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.265416 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.455549 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 17:51:58 
crc kubenswrapper[4762]: I0217 17:51:58.580109 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.636785 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.695479 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.711933 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.768791 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.796910 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 17:51:58 crc kubenswrapper[4762]: I0217 17:51:58.997218 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.040189 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.092021 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.199280 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.211804 4762 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.298142 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.439290 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.487108 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.529734 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.576207 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.583986 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.609757 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.612430 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.718227 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 17:51:59 crc kubenswrapper[4762]: I0217 17:51:59.822003 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.005717 4762 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.060364 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.288726 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.355148 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.391545 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.462209 4762 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.472004 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.543485 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.544219 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.633385 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.674468 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.679332 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.692123 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.708059 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.772007 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.825808 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 17:52:00 crc kubenswrapper[4762]: I0217 17:52:00.873333 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.074415 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.165432 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.167901 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.201357 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:52:01 crc kubenswrapper[4762]: 
I0217 17:52:01.265867 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.322450 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.451408 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.506883 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.524931 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.682492 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.703513 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.803442 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.816570 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.832361 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.866909 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.902102 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 17:52:01 crc kubenswrapper[4762]: I0217 17:52:01.929396 4762 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.035367 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.063190 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.105837 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.184247 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.200319 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.319701 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.337613 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.364800 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.389863 4762 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.478245 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.481756 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.483544 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.544145 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 17:52:02 crc kubenswrapper[4762]: I0217 17:52:02.770896 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.138170 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.138294 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.202287 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.287798 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.441931 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" 
Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.679804 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.804131 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.834161 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.893866 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.918432 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.946710 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.951473 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:52:03 crc kubenswrapper[4762]: I0217 17:52:03.994276 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.039798 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.048732 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.055867 4762 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.095151 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.135980 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.165036 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.355673 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.447988 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.485824 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.487938 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.503439 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.507788 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.602765 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 17:52:04 crc 
kubenswrapper[4762]: I0217 17:52:04.615114 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.627115 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.628662 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.777718 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.807785 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 17:52:04 crc kubenswrapper[4762]: I0217 17:52:04.974387 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.130813 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.140346 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.227844 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.252298 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.422236 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.434354 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.435084 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.450747 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.490162 4762 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.493978 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.677967 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.678574 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.709755 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.736726 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.761382 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 17:52:05 crc 
kubenswrapper[4762]: I0217 17:52:05.874929 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.905363 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.921556 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.970503 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 17:52:05 crc kubenswrapper[4762]: I0217 17:52:05.977434 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.163238 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.214094 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.277217 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.367958 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.415237 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.441181 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.484756 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.668808 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.721203 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.840536 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.874922 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.874959 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 17:52:06 crc kubenswrapper[4762]: I0217 17:52:06.902199 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.063359 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.076515 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.082712 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.106319 4762 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.258492 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.281510 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.297283 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.328857 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.393206 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.401801 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.469474 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.476616 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.502884 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.532141 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 17:52:07 crc 
kubenswrapper[4762]: I0217 17:52:07.625117 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.656022 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.796588 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.838701 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.870942 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 17:52:07 crc kubenswrapper[4762]: I0217 17:52:07.931691 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.154893 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.193934 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.345361 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.366164 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.520901 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 
17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.523865 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.757136 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.801354 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.955130 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 17:52:08 crc kubenswrapper[4762]: I0217 17:52:08.955252 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.083756 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.084545 4762 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.085027 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.085012317 podStartE2EDuration="45.085012317s" podCreationTimestamp="2026-02-17 17:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:51:43.186934486 +0000 UTC m=+254.831852496" watchObservedRunningTime="2026-02-17 17:52:09.085012317 +0000 UTC m=+280.729930317" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.088549 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-9bp9t"] Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.088607 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.190858 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.215294 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.215273895 podStartE2EDuration="26.215273895s" podCreationTimestamp="2026-02-17 17:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:52:09.212558654 +0000 UTC m=+280.857476674" watchObservedRunningTime="2026-02-17 17:52:09.215273895 +0000 UTC m=+280.860191905" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.263045 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.298241 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.647126 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.647667 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.651179 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.652082 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-m5bdm"] Feb 17 17:52:09 crc kubenswrapper[4762]: E0217 17:52:09.652471 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" containerName="oauth-openshift" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.652490 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" containerName="oauth-openshift" Feb 17 17:52:09 crc kubenswrapper[4762]: E0217 17:52:09.652510 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" containerName="installer" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.652517 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" containerName="installer" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.652664 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c4e4c3-1636-4e35-bd78-c3139a2fb077" containerName="installer" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.652679 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" containerName="oauth-openshift" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.653161 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.655584 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.657849 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.658062 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.658939 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.661568 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.661742 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.663490 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.663759 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.663908 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.663917 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 
17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.664456 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.664826 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.665586 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.669578 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.670102 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.675693 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-m5bdm"] Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.680212 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.844713 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.844766 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.844796 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rndh\" (UniqueName: \"kubernetes.io/projected/8c360bc7-3426-4689-9140-b5f7247e9a5e-kube-api-access-9rndh\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.844827 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.844993 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845071 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c360bc7-3426-4689-9140-b5f7247e9a5e-audit-dir\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " 
pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845093 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845114 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845174 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-audit-policies\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845267 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845320 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845349 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.845379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.887861 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 17:52:09 crc 
kubenswrapper[4762]: I0217 17:52:09.889498 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-audit-policies\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946706 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946772 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946796 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946820 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946835 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946854 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rndh\" (UniqueName: \"kubernetes.io/projected/8c360bc7-3426-4689-9140-b5f7247e9a5e-kube-api-access-9rndh\") pod 
\"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946874 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946902 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946921 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c360bc7-3426-4689-9140-b5f7247e9a5e-audit-dir\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946938 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.946970 4762 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.947325 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c360bc7-3426-4689-9140-b5f7247e9a5e-audit-dir\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.947705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-audit-policies\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.947867 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.947983 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc 
kubenswrapper[4762]: I0217 17:52:09.949873 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.953089 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.953090 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.953844 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.953978 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.956939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.957176 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.957519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.957559 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c360bc7-3426-4689-9140-b5f7247e9a5e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 
crc kubenswrapper[4762]: I0217 17:52:09.966126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rndh\" (UniqueName: \"kubernetes.io/projected/8c360bc7-3426-4689-9140-b5f7247e9a5e-kube-api-access-9rndh\") pod \"oauth-openshift-fb6b676c8-m5bdm\" (UID: \"8c360bc7-3426-4689-9140-b5f7247e9a5e\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:09 crc kubenswrapper[4762]: I0217 17:52:09.976280 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.222981 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.401135 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.421667 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.488398 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.530105 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-m5bdm"] Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.575992 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.764501 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.839998 4762 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 17:52:10 crc kubenswrapper[4762]: I0217 17:52:10.859171 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.045711 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fb25d5-f8ca-43c5-ae4d-31da698c4780" path="/var/lib/kubelet/pods/35fb25d5-f8ca-43c5-ae4d-31da698c4780/volumes" Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.094365 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.235713 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.494386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" event={"ID":"8c360bc7-3426-4689-9140-b5f7247e9a5e","Type":"ContainerStarted","Data":"62adb00c08afd66a04cd5bccdde7429be1da9af79369462f881a75778098adca"} Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.494431 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" event={"ID":"8c360bc7-3426-4689-9140-b5f7247e9a5e","Type":"ContainerStarted","Data":"51499a775077c9d014be6f8425f43d940092bc3671cd58347e52ec5961c8a591"} Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.494817 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.503168 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" Feb 17 17:52:11 crc kubenswrapper[4762]: I0217 17:52:11.521947 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-fb6b676c8-m5bdm" podStartSLOduration=63.521923563 podStartE2EDuration="1m3.521923563s" podCreationTimestamp="2026-02-17 17:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:52:11.51580161 +0000 UTC m=+283.160719620" watchObservedRunningTime="2026-02-17 17:52:11.521923563 +0000 UTC m=+283.166841613" Feb 17 17:52:12 crc kubenswrapper[4762]: I0217 17:52:12.392302 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 17:52:12 crc kubenswrapper[4762]: I0217 17:52:12.392328 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 17:52:13 crc kubenswrapper[4762]: I0217 17:52:13.440580 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 17:52:17 crc kubenswrapper[4762]: I0217 17:52:17.453461 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:52:17 crc kubenswrapper[4762]: I0217 17:52:17.454019 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://cd3dc69ae9bea17fcca6cdada676ea1fec275c9bb7d4edc03fa33e73d1a77f6c" gracePeriod=5 Feb 17 17:52:22 crc kubenswrapper[4762]: I0217 17:52:22.571043 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 17:52:22 crc kubenswrapper[4762]: I0217 17:52:22.571599 4762 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="cd3dc69ae9bea17fcca6cdada676ea1fec275c9bb7d4edc03fa33e73d1a77f6c" exitCode=137 Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.022380 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.022469 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.044396 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.057339 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.057370 4762 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e013b3c5-5a62-4f85-a04b-0008db4bed34" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.059467 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.059503 4762 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e013b3c5-5a62-4f85-a04b-0008db4bed34" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132028 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132077 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132121 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132137 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132182 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132200 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132244 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132267 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132458 4762 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132468 4762 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.132477 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.133111 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.139819 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.233784 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.233818 4762 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.578946 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.579033 4762 scope.go:117] "RemoveContainer" containerID="cd3dc69ae9bea17fcca6cdada676ea1fec275c9bb7d4edc03fa33e73d1a77f6c" Feb 17 17:52:23 crc kubenswrapper[4762]: I0217 17:52:23.579181 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:52:25 crc kubenswrapper[4762]: I0217 17:52:25.041167 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 17:52:26 crc kubenswrapper[4762]: I0217 17:52:26.597391 4762 generic.go:334] "Generic (PLEG): container finished" podID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerID="1a0143c62ed1a62e452f6b8b766a823202c59d3d24486e467723cd1a6b4adaa2" exitCode=0 Feb 17 17:52:26 crc kubenswrapper[4762]: I0217 17:52:26.597483 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" event={"ID":"2d3444be-9dcc-4072-9735-120bfeaa36aa","Type":"ContainerDied","Data":"1a0143c62ed1a62e452f6b8b766a823202c59d3d24486e467723cd1a6b4adaa2"} Feb 17 17:52:26 crc kubenswrapper[4762]: I0217 17:52:26.598910 4762 scope.go:117] "RemoveContainer" containerID="1a0143c62ed1a62e452f6b8b766a823202c59d3d24486e467723cd1a6b4adaa2" Feb 17 17:52:27 crc kubenswrapper[4762]: I0217 17:52:27.610564 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" event={"ID":"2d3444be-9dcc-4072-9735-120bfeaa36aa","Type":"ContainerStarted","Data":"289c2d484da2064cbe834acdc3553b856255f20b383be346003cfc595597dd94"} Feb 17 17:52:27 crc kubenswrapper[4762]: I0217 17:52:27.611506 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:52:27 crc kubenswrapper[4762]: I0217 17:52:27.613929 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:52:28 crc kubenswrapper[4762]: I0217 17:52:28.839543 4762 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start 
using new credentials Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.027528 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-htp99"] Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.028105 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" podUID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" containerName="controller-manager" containerID="cri-o://53a35ab7f66de6b8572565f9321b93045288c8590d4ef842fcb0ad576519eaf2" gracePeriod=30 Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.113852 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r"] Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.114066 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" podUID="3c1453fe-730e-49d9-9d85-efbfec1ca329" containerName="route-controller-manager" containerID="cri-o://e2e2e1ea687f9294ba50a795fce72bcee9ed632d5eb3a3a75cee69e2e7cf01b2" gracePeriod=30 Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.650160 4762 generic.go:334] "Generic (PLEG): container finished" podID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" containerID="53a35ab7f66de6b8572565f9321b93045288c8590d4ef842fcb0ad576519eaf2" exitCode=0 Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.650582 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" event={"ID":"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce","Type":"ContainerDied","Data":"53a35ab7f66de6b8572565f9321b93045288c8590d4ef842fcb0ad576519eaf2"} Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.651887 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" event={"ID":"3c1453fe-730e-49d9-9d85-efbfec1ca329","Type":"ContainerDied","Data":"e2e2e1ea687f9294ba50a795fce72bcee9ed632d5eb3a3a75cee69e2e7cf01b2"} Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.651870 4762 generic.go:334] "Generic (PLEG): container finished" podID="3c1453fe-730e-49d9-9d85-efbfec1ca329" containerID="e2e2e1ea687f9294ba50a795fce72bcee9ed632d5eb3a3a75cee69e2e7cf01b2" exitCode=0 Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.878753 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.926650 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-serving-cert\") pod \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.927051 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-config\") pod \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.927202 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6zv\" (UniqueName: \"kubernetes.io/projected/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-kube-api-access-5m6zv\") pod \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.930499 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-config" (OuterVolumeSpecName: "config") 
pod "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" (UID: "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.947433 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-kube-api-access-5m6zv" (OuterVolumeSpecName: "kube-api-access-5m6zv") pod "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" (UID: "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce"). InnerVolumeSpecName "kube-api-access-5m6zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.953293 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" (UID: "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:52:34 crc kubenswrapper[4762]: I0217 17:52:34.984353 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028136 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-client-ca\") pod \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028776 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-config\") pod \"3c1453fe-730e-49d9-9d85-efbfec1ca329\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028805 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-client-ca\") pod \"3c1453fe-730e-49d9-9d85-efbfec1ca329\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028842 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-proxy-ca-bundles\") pod \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\" (UID: \"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce\") " Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028874 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1453fe-730e-49d9-9d85-efbfec1ca329-serving-cert\") pod \"3c1453fe-730e-49d9-9d85-efbfec1ca329\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028939 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tprpj\" (UniqueName: \"kubernetes.io/projected/3c1453fe-730e-49d9-9d85-efbfec1ca329-kube-api-access-tprpj\") pod \"3c1453fe-730e-49d9-9d85-efbfec1ca329\" (UID: \"3c1453fe-730e-49d9-9d85-efbfec1ca329\") " Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.029155 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.029172 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.029181 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6zv\" (UniqueName: \"kubernetes.io/projected/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-kube-api-access-5m6zv\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.028692 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" (UID: "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.030204 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c1453fe-730e-49d9-9d85-efbfec1ca329" (UID: "3c1453fe-730e-49d9-9d85-efbfec1ca329"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.030602 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" (UID: "3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.030652 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-config" (OuterVolumeSpecName: "config") pod "3c1453fe-730e-49d9-9d85-efbfec1ca329" (UID: "3c1453fe-730e-49d9-9d85-efbfec1ca329"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.033103 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1453fe-730e-49d9-9d85-efbfec1ca329-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c1453fe-730e-49d9-9d85-efbfec1ca329" (UID: "3c1453fe-730e-49d9-9d85-efbfec1ca329"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.033205 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1453fe-730e-49d9-9d85-efbfec1ca329-kube-api-access-tprpj" (OuterVolumeSpecName: "kube-api-access-tprpj") pod "3c1453fe-730e-49d9-9d85-efbfec1ca329" (UID: "3c1453fe-730e-49d9-9d85-efbfec1ca329"). InnerVolumeSpecName "kube-api-access-tprpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.130531 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.130598 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c1453fe-730e-49d9-9d85-efbfec1ca329-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.130612 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.130647 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1453fe-730e-49d9-9d85-efbfec1ca329-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.130659 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tprpj\" (UniqueName: \"kubernetes.io/projected/3c1453fe-730e-49d9-9d85-efbfec1ca329-kube-api-access-tprpj\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.130672 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.658018 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.658006 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-htp99" event={"ID":"3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce","Type":"ContainerDied","Data":"028478e88fce412c248fafaf5105889871e25671eee4c5f15c8e01aca3a96177"} Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.658530 4762 scope.go:117] "RemoveContainer" containerID="53a35ab7f66de6b8572565f9321b93045288c8590d4ef842fcb0ad576519eaf2" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.659393 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" event={"ID":"3c1453fe-730e-49d9-9d85-efbfec1ca329","Type":"ContainerDied","Data":"8ccaee0ec0074f7c7137a012d5b5a56f01aad1f3f5e5f69266d2323616ea189e"} Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.659440 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.678070 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-htp99"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.678177 4762 scope.go:117] "RemoveContainer" containerID="e2e2e1ea687f9294ba50a795fce72bcee9ed632d5eb3a3a75cee69e2e7cf01b2" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.681446 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-htp99"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.687902 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.692032 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gjb9r"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721274 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7797dfb97f-z77l6"] Feb 17 17:52:35 crc kubenswrapper[4762]: E0217 17:52:35.721612 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721654 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 17:52:35 crc kubenswrapper[4762]: E0217 17:52:35.721678 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1453fe-730e-49d9-9d85-efbfec1ca329" containerName="route-controller-manager" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721688 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3c1453fe-730e-49d9-9d85-efbfec1ca329" containerName="route-controller-manager" Feb 17 17:52:35 crc kubenswrapper[4762]: E0217 17:52:35.721704 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" containerName="controller-manager" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721714 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" containerName="controller-manager" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721857 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721871 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" containerName="controller-manager" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.721883 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1453fe-730e-49d9-9d85-efbfec1ca329" containerName="route-controller-manager" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.722444 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.723842 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.723965 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.724281 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.724468 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.724706 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.724796 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.725309 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.726176 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.727426 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.730078 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.730321 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.730568 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.730715 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.730820 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.733726 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7797dfb97f-z77l6"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.735924 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736750 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-proxy-ca-bundles\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: 
\"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736799 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235e6b4a-040d-47af-9560-592afd12bc4c-serving-cert\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736854 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wggls\" (UniqueName: \"kubernetes.io/projected/235e6b4a-040d-47af-9560-592afd12bc4c-kube-api-access-wggls\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736885 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-config\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736909 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-client-ca\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736933 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-config\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736965 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c58495fd-7508-4f44-a16f-06209a177fa3-serving-cert\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.736995 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwr6k\" (UniqueName: \"kubernetes.io/projected/c58495fd-7508-4f44-a16f-06209a177fa3-kube-api-access-pwr6k\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.737027 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-client-ca\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.737124 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837676 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-proxy-ca-bundles\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837741 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235e6b4a-040d-47af-9560-592afd12bc4c-serving-cert\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837790 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wggls\" (UniqueName: \"kubernetes.io/projected/235e6b4a-040d-47af-9560-592afd12bc4c-kube-api-access-wggls\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837815 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-config\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837834 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-client-ca\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837856 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-config\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c58495fd-7508-4f44-a16f-06209a177fa3-serving-cert\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837906 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwr6k\" (UniqueName: \"kubernetes.io/projected/c58495fd-7508-4f44-a16f-06209a177fa3-kube-api-access-pwr6k\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.837933 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-client-ca\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.840028 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-config\") 
pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.842062 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7797dfb97f-z77l6"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.842463 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-client-ca\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: E0217 17:52:35.842486 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca kube-api-access-pwr6k proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" podUID="c58495fd-7508-4f44-a16f-06209a177fa3" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.842882 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-config\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.843693 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-proxy-ca-bundles\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 
17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.847576 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235e6b4a-040d-47af-9560-592afd12bc4c-serving-cert\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.854759 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-client-ca\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.856251 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c58495fd-7508-4f44-a16f-06209a177fa3-serving-cert\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.859315 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wggls\" (UniqueName: \"kubernetes.io/projected/235e6b4a-040d-47af-9560-592afd12bc4c-kube-api-access-wggls\") pod \"route-controller-manager-dbbf5946d-r95n6\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.863386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwr6k\" (UniqueName: \"kubernetes.io/projected/c58495fd-7508-4f44-a16f-06209a177fa3-kube-api-access-pwr6k\") pod \"controller-manager-7797dfb97f-z77l6\" (UID: 
\"c58495fd-7508-4f44-a16f-06209a177fa3\") " pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.885413 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6"] Feb 17 17:52:35 crc kubenswrapper[4762]: I0217 17:52:35.885808 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.157482 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6"] Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.667205 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" event={"ID":"235e6b4a-040d-47af-9560-592afd12bc4c","Type":"ContainerStarted","Data":"728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94"} Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.667247 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" event={"ID":"235e6b4a-040d-47af-9560-592afd12bc4c","Type":"ContainerStarted","Data":"2548202d4951e50972f1d60a8daf3d74ad5d06ab8d057c5f460d5c0cbf772c97"} Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.667348 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" podUID="235e6b4a-040d-47af-9560-592afd12bc4c" containerName="route-controller-manager" containerID="cri-o://728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94" gracePeriod=30 Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.667560 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.669018 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.678851 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.707989 4762 patch_prober.go:28] interesting pod/route-controller-manager-dbbf5946d-r95n6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:45464->10.217.0.58:8443: read: connection reset by peer" start-of-body= Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.708424 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" podUID="235e6b4a-040d-47af-9560-592afd12bc4c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:45464->10.217.0.58:8443: read: connection reset by peer" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.847387 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-proxy-ca-bundles\") pod \"c58495fd-7508-4f44-a16f-06209a177fa3\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.847466 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwr6k\" (UniqueName: \"kubernetes.io/projected/c58495fd-7508-4f44-a16f-06209a177fa3-kube-api-access-pwr6k\") pod 
\"c58495fd-7508-4f44-a16f-06209a177fa3\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.847489 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-config\") pod \"c58495fd-7508-4f44-a16f-06209a177fa3\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.847562 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c58495fd-7508-4f44-a16f-06209a177fa3-serving-cert\") pod \"c58495fd-7508-4f44-a16f-06209a177fa3\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.847589 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-client-ca\") pod \"c58495fd-7508-4f44-a16f-06209a177fa3\" (UID: \"c58495fd-7508-4f44-a16f-06209a177fa3\") " Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.848604 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-client-ca" (OuterVolumeSpecName: "client-ca") pod "c58495fd-7508-4f44-a16f-06209a177fa3" (UID: "c58495fd-7508-4f44-a16f-06209a177fa3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.848702 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-config" (OuterVolumeSpecName: "config") pod "c58495fd-7508-4f44-a16f-06209a177fa3" (UID: "c58495fd-7508-4f44-a16f-06209a177fa3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.848734 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c58495fd-7508-4f44-a16f-06209a177fa3" (UID: "c58495fd-7508-4f44-a16f-06209a177fa3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.853943 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58495fd-7508-4f44-a16f-06209a177fa3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c58495fd-7508-4f44-a16f-06209a177fa3" (UID: "c58495fd-7508-4f44-a16f-06209a177fa3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.858308 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58495fd-7508-4f44-a16f-06209a177fa3-kube-api-access-pwr6k" (OuterVolumeSpecName: "kube-api-access-pwr6k") pod "c58495fd-7508-4f44-a16f-06209a177fa3" (UID: "c58495fd-7508-4f44-a16f-06209a177fa3"). InnerVolumeSpecName "kube-api-access-pwr6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.949379 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c58495fd-7508-4f44-a16f-06209a177fa3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.949424 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.949435 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.949449 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwr6k\" (UniqueName: \"kubernetes.io/projected/c58495fd-7508-4f44-a16f-06209a177fa3-kube-api-access-pwr6k\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.949462 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58495fd-7508-4f44-a16f-06209a177fa3-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.997010 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-dbbf5946d-r95n6_235e6b4a-040d-47af-9560-592afd12bc4c/route-controller-manager/0.log" Feb 17 17:52:36 crc kubenswrapper[4762]: I0217 17:52:36.997078 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.045737 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1453fe-730e-49d9-9d85-efbfec1ca329" path="/var/lib/kubelet/pods/3c1453fe-730e-49d9-9d85-efbfec1ca329/volumes" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.046475 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce" path="/var/lib/kubelet/pods/3d07fe7f-b9d0-4e9d-ab69-bafb51ae62ce/volumes" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.156607 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-config\") pod \"235e6b4a-040d-47af-9560-592afd12bc4c\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.156767 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wggls\" (UniqueName: \"kubernetes.io/projected/235e6b4a-040d-47af-9560-592afd12bc4c-kube-api-access-wggls\") pod \"235e6b4a-040d-47af-9560-592afd12bc4c\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.156834 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-client-ca\") pod \"235e6b4a-040d-47af-9560-592afd12bc4c\" (UID: \"235e6b4a-040d-47af-9560-592afd12bc4c\") " Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.156910 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235e6b4a-040d-47af-9560-592afd12bc4c-serving-cert\") pod \"235e6b4a-040d-47af-9560-592afd12bc4c\" (UID: 
\"235e6b4a-040d-47af-9560-592afd12bc4c\") " Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.157669 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "235e6b4a-040d-47af-9560-592afd12bc4c" (UID: "235e6b4a-040d-47af-9560-592afd12bc4c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.157700 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-config" (OuterVolumeSpecName: "config") pod "235e6b4a-040d-47af-9560-592afd12bc4c" (UID: "235e6b4a-040d-47af-9560-592afd12bc4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.160276 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235e6b4a-040d-47af-9560-592afd12bc4c-kube-api-access-wggls" (OuterVolumeSpecName: "kube-api-access-wggls") pod "235e6b4a-040d-47af-9560-592afd12bc4c" (UID: "235e6b4a-040d-47af-9560-592afd12bc4c"). InnerVolumeSpecName "kube-api-access-wggls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.160785 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235e6b4a-040d-47af-9560-592afd12bc4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "235e6b4a-040d-47af-9560-592afd12bc4c" (UID: "235e6b4a-040d-47af-9560-592afd12bc4c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.258066 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.258118 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wggls\" (UniqueName: \"kubernetes.io/projected/235e6b4a-040d-47af-9560-592afd12bc4c-kube-api-access-wggls\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.258132 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/235e6b4a-040d-47af-9560-592afd12bc4c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.258144 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235e6b4a-040d-47af-9560-592afd12bc4c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674489 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-dbbf5946d-r95n6_235e6b4a-040d-47af-9560-592afd12bc4c/route-controller-manager/0.log" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674839 4762 generic.go:334] "Generic (PLEG): container finished" podID="235e6b4a-040d-47af-9560-592afd12bc4c" containerID="728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94" exitCode=255 Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674903 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" event={"ID":"235e6b4a-040d-47af-9560-592afd12bc4c","Type":"ContainerDied","Data":"728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94"} Feb 17 
17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674929 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6" event={"ID":"235e6b4a-040d-47af-9560-592afd12bc4c","Type":"ContainerDied","Data":"2548202d4951e50972f1d60a8daf3d74ad5d06ab8d057c5f460d5c0cbf772c97"} Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674977 4762 scope.go:117] "RemoveContainer" containerID="728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.674915 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7797dfb97f-z77l6" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.697200 4762 scope.go:117] "RemoveContainer" containerID="728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94" Feb 17 17:52:37 crc kubenswrapper[4762]: E0217 17:52:37.697718 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94\": container with ID starting with 728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94 not found: ID does not exist" containerID="728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.697767 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94"} err="failed to get container status \"728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94\": rpc error: code = NotFound desc = could not find container 
\"728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94\": container with ID starting with 728e959c9cbccf181fa1e61da3ceb2ef65388c82f5cd04feed1680e45aa64d94 not found: ID does not exist" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.709686 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7797dfb97f-z77l6"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.714714 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7797dfb97f-z77l6"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.724558 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"] Feb 17 17:52:37 crc kubenswrapper[4762]: E0217 17:52:37.724828 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e6b4a-040d-47af-9560-592afd12bc4c" containerName="route-controller-manager" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.724851 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e6b4a-040d-47af-9560-592afd12bc4c" containerName="route-controller-manager" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.724972 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e6b4a-040d-47af-9560-592afd12bc4c" containerName="route-controller-manager" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.725299 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.728350 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.728720 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.728791 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.730748 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.731725 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.733562 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.735910 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.736073 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.736826 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.737099 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.737235 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.737143 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.737184 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.737552 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.737589 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.739209 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.742344 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbbf5946d-r95n6"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.744223 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.752185 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b"] Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866370 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-proxy-ca-bundles\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866449 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-config\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866493 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hgvb\" (UniqueName: \"kubernetes.io/projected/72306799-50e6-4609-92f0-cece95923211-kube-api-access-9hgvb\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866521 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2w5s\" (UniqueName: \"kubernetes.io/projected/bd19545c-53f3-4b81-8e8f-4293cd706247-kube-api-access-s2w5s\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866703 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-client-ca\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " 
pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866764 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd19545c-53f3-4b81-8e8f-4293cd706247-serving-cert\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-config\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866823 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72306799-50e6-4609-92f0-cece95923211-serving-cert\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.866852 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-client-ca\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967285 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-config\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967347 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hgvb\" (UniqueName: \"kubernetes.io/projected/72306799-50e6-4609-92f0-cece95923211-kube-api-access-9hgvb\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967379 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2w5s\" (UniqueName: \"kubernetes.io/projected/bd19545c-53f3-4b81-8e8f-4293cd706247-kube-api-access-s2w5s\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967408 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-client-ca\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd19545c-53f3-4b81-8e8f-4293cd706247-serving-cert\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc 
kubenswrapper[4762]: I0217 17:52:37.967452 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-config\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967471 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72306799-50e6-4609-92f0-cece95923211-serving-cert\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967487 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-client-ca\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.967508 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-proxy-ca-bundles\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.968690 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-proxy-ca-bundles\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: 
\"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.968911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-client-ca\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.969072 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-client-ca\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.969814 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-config\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.969911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-config\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.972836 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd19545c-53f3-4b81-8e8f-4293cd706247-serving-cert\") pod 
\"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.989663 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72306799-50e6-4609-92f0-cece95923211-serving-cert\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.992808 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hgvb\" (UniqueName: \"kubernetes.io/projected/72306799-50e6-4609-92f0-cece95923211-kube-api-access-9hgvb\") pod \"route-controller-manager-dfff9545c-m792b\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:37 crc kubenswrapper[4762]: I0217 17:52:37.993984 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2w5s\" (UniqueName: \"kubernetes.io/projected/bd19545c-53f3-4b81-8e8f-4293cd706247-kube-api-access-s2w5s\") pod \"controller-manager-5ff6ccf94b-gcs77\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") " pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.046922 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.060377 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.295644 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"] Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.358792 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b"] Feb 17 17:52:38 crc kubenswrapper[4762]: W0217 17:52:38.369885 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72306799_50e6_4609_92f0_cece95923211.slice/crio-e38c9e5f4ec7a4cfcd87b97cb26415e439bb1abd86f36c203b92e4787fffce08 WatchSource:0}: Error finding container e38c9e5f4ec7a4cfcd87b97cb26415e439bb1abd86f36c203b92e4787fffce08: Status 404 returned error can't find the container with id e38c9e5f4ec7a4cfcd87b97cb26415e439bb1abd86f36c203b92e4787fffce08 Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.680261 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" event={"ID":"72306799-50e6-4609-92f0-cece95923211","Type":"ContainerStarted","Data":"2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429"} Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.680597 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.680609 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" event={"ID":"72306799-50e6-4609-92f0-cece95923211","Type":"ContainerStarted","Data":"e38c9e5f4ec7a4cfcd87b97cb26415e439bb1abd86f36c203b92e4787fffce08"} Feb 17 17:52:38 crc kubenswrapper[4762]: 
I0217 17:52:38.684514 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" event={"ID":"bd19545c-53f3-4b81-8e8f-4293cd706247","Type":"ContainerStarted","Data":"0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb"} Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.684541 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" event={"ID":"bd19545c-53f3-4b81-8e8f-4293cd706247","Type":"ContainerStarted","Data":"5dc08bfa83203c274815ba506890fd9e6eeef93245fa53c92c55b04987ea1dde"} Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.684737 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.688826 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" Feb 17 17:52:38 crc kubenswrapper[4762]: I0217 17:52:38.700803 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" podStartSLOduration=2.700779659 podStartE2EDuration="2.700779659s" podCreationTimestamp="2026-02-17 17:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:52:38.698272964 +0000 UTC m=+310.343190974" watchObservedRunningTime="2026-02-17 17:52:38.700779659 +0000 UTC m=+310.345697669" Feb 17 17:52:39 crc kubenswrapper[4762]: I0217 17:52:39.001107 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:39 crc kubenswrapper[4762]: I0217 17:52:39.019068 4762 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" podStartSLOduration=3.01905029 podStartE2EDuration="3.01905029s" podCreationTimestamp="2026-02-17 17:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:52:38.717548023 +0000 UTC m=+310.362466033" watchObservedRunningTime="2026-02-17 17:52:39.01905029 +0000 UTC m=+310.663968300" Feb 17 17:52:39 crc kubenswrapper[4762]: I0217 17:52:39.041468 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235e6b4a-040d-47af-9560-592afd12bc4c" path="/var/lib/kubelet/pods/235e6b4a-040d-47af-9560-592afd12bc4c/volumes" Feb 17 17:52:39 crc kubenswrapper[4762]: I0217 17:52:39.042031 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58495fd-7508-4f44-a16f-06209a177fa3" path="/var/lib/kubelet/pods/c58495fd-7508-4f44-a16f-06209a177fa3/volumes" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.003544 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b"] Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.004392 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" podUID="72306799-50e6-4609-92f0-cece95923211" containerName="route-controller-manager" containerID="cri-o://2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429" gracePeriod=30 Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.418504 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.583897 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72306799-50e6-4609-92f0-cece95923211-serving-cert\") pod \"72306799-50e6-4609-92f0-cece95923211\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.584059 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-client-ca\") pod \"72306799-50e6-4609-92f0-cece95923211\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.585141 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-client-ca" (OuterVolumeSpecName: "client-ca") pod "72306799-50e6-4609-92f0-cece95923211" (UID: "72306799-50e6-4609-92f0-cece95923211"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.585261 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hgvb\" (UniqueName: \"kubernetes.io/projected/72306799-50e6-4609-92f0-cece95923211-kube-api-access-9hgvb\") pod \"72306799-50e6-4609-92f0-cece95923211\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.585317 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-config\") pod \"72306799-50e6-4609-92f0-cece95923211\" (UID: \"72306799-50e6-4609-92f0-cece95923211\") " Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.585790 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.586311 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-config" (OuterVolumeSpecName: "config") pod "72306799-50e6-4609-92f0-cece95923211" (UID: "72306799-50e6-4609-92f0-cece95923211"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.589483 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72306799-50e6-4609-92f0-cece95923211-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "72306799-50e6-4609-92f0-cece95923211" (UID: "72306799-50e6-4609-92f0-cece95923211"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.593914 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72306799-50e6-4609-92f0-cece95923211-kube-api-access-9hgvb" (OuterVolumeSpecName: "kube-api-access-9hgvb") pod "72306799-50e6-4609-92f0-cece95923211" (UID: "72306799-50e6-4609-92f0-cece95923211"). InnerVolumeSpecName "kube-api-access-9hgvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.687170 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hgvb\" (UniqueName: \"kubernetes.io/projected/72306799-50e6-4609-92f0-cece95923211-kube-api-access-9hgvb\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.687201 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72306799-50e6-4609-92f0-cece95923211-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.687210 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72306799-50e6-4609-92f0-cece95923211-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.771548 4762 generic.go:334] "Generic (PLEG): container finished" podID="72306799-50e6-4609-92f0-cece95923211" containerID="2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429" exitCode=0 Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.771588 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" event={"ID":"72306799-50e6-4609-92f0-cece95923211","Type":"ContainerDied","Data":"2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429"} Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.771612 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" event={"ID":"72306799-50e6-4609-92f0-cece95923211","Type":"ContainerDied","Data":"e38c9e5f4ec7a4cfcd87b97cb26415e439bb1abd86f36c203b92e4787fffce08"} Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.771644 4762 scope.go:117] "RemoveContainer" containerID="2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.771645 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.787105 4762 scope.go:117] "RemoveContainer" containerID="2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429" Feb 17 17:52:54 crc kubenswrapper[4762]: E0217 17:52:54.787474 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429\": container with ID starting with 2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429 not found: ID does not exist" containerID="2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.787514 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429"} err="failed to get container status \"2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429\": rpc error: code = NotFound desc = could not find container \"2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429\": container with ID starting with 2d90d3aed4a84b36307b5832a3d433bb75982cfd28882ff77faeea188a56c429 not found: ID does not exist" Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.795358 4762 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b"] Feb 17 17:52:54 crc kubenswrapper[4762]: I0217 17:52:54.798743 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfff9545c-m792b"] Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.044086 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72306799-50e6-4609-92f0-cece95923211" path="/var/lib/kubelet/pods/72306799-50e6-4609-92f0-cece95923211/volumes" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.741155 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv"] Feb 17 17:52:55 crc kubenswrapper[4762]: E0217 17:52:55.741841 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72306799-50e6-4609-92f0-cece95923211" containerName="route-controller-manager" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.741869 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="72306799-50e6-4609-92f0-cece95923211" containerName="route-controller-manager" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.742072 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="72306799-50e6-4609-92f0-cece95923211" containerName="route-controller-manager" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.742536 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.746507 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.746745 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.747030 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.747202 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.747353 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.747891 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.754599 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv"] Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.900997 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/412a4a82-0291-428a-824d-590abfbe9a6f-serving-cert\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.901348 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/412a4a82-0291-428a-824d-590abfbe9a6f-client-ca\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.901450 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24z96\" (UniqueName: \"kubernetes.io/projected/412a4a82-0291-428a-824d-590abfbe9a6f-kube-api-access-24z96\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:55 crc kubenswrapper[4762]: I0217 17:52:55.901540 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412a4a82-0291-428a-824d-590abfbe9a6f-config\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.003555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/412a4a82-0291-428a-824d-590abfbe9a6f-client-ca\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.004000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z96\" (UniqueName: \"kubernetes.io/projected/412a4a82-0291-428a-824d-590abfbe9a6f-kube-api-access-24z96\") pod 
\"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.004236 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412a4a82-0291-428a-824d-590abfbe9a6f-config\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.004526 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/412a4a82-0291-428a-824d-590abfbe9a6f-serving-cert\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.004932 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/412a4a82-0291-428a-824d-590abfbe9a6f-client-ca\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.006519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412a4a82-0291-428a-824d-590abfbe9a6f-config\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.022524 4762 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/412a4a82-0291-428a-824d-590abfbe9a6f-serving-cert\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.028208 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24z96\" (UniqueName: \"kubernetes.io/projected/412a4a82-0291-428a-824d-590abfbe9a6f-kube-api-access-24z96\") pod \"route-controller-manager-6f7495f4b4-wrxvv\" (UID: \"412a4a82-0291-428a-824d-590abfbe9a6f\") " pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.100662 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.533514 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv"] Feb 17 17:52:56 crc kubenswrapper[4762]: W0217 17:52:56.536924 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod412a4a82_0291_428a_824d_590abfbe9a6f.slice/crio-be5b01afcdda7686d59f955dfcd9cace9ddd232c9703872264eb7b631f7532c1 WatchSource:0}: Error finding container be5b01afcdda7686d59f955dfcd9cace9ddd232c9703872264eb7b631f7532c1: Status 404 returned error can't find the container with id be5b01afcdda7686d59f955dfcd9cace9ddd232c9703872264eb7b631f7532c1 Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.801045 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" 
event={"ID":"412a4a82-0291-428a-824d-590abfbe9a6f","Type":"ContainerStarted","Data":"97076cdf65087874403d3f7fdf68ce1df2b0e239e454b09255a8eecbbb3928b6"} Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.801093 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" event={"ID":"412a4a82-0291-428a-824d-590abfbe9a6f","Type":"ContainerStarted","Data":"be5b01afcdda7686d59f955dfcd9cace9ddd232c9703872264eb7b631f7532c1"} Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.801424 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:52:56 crc kubenswrapper[4762]: I0217 17:52:56.826046 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" podStartSLOduration=2.826020476 podStartE2EDuration="2.826020476s" podCreationTimestamp="2026-02-17 17:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:52:56.823057877 +0000 UTC m=+328.467975887" watchObservedRunningTime="2026-02-17 17:52:56.826020476 +0000 UTC m=+328.470938506" Feb 17 17:52:57 crc kubenswrapper[4762]: I0217 17:52:57.115163 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f7495f4b4-wrxvv" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.526789 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7cztq"] Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.528423 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.544011 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7cztq"] Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.666861 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvw7f\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-kube-api-access-hvw7f\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.666999 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.667033 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-bound-sa-token\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.667055 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-registry-tls\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.667085 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.667191 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-registry-certificates\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.667219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-trusted-ca\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.667270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.697800 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768232 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768656 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvw7f\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-kube-api-access-hvw7f\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768741 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-bound-sa-token\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 
17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768766 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-registry-tls\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768830 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-registry-certificates\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.768860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-trusted-ca\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.770409 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.771252 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-registry-certificates\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.771839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-trusted-ca\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.783797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-registry-tls\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.784267 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.789112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-bound-sa-token\") pod \"image-registry-66df7c8f76-7cztq\" (UID: \"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.792865 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvw7f\" (UniqueName: \"kubernetes.io/projected/fedc64f9-b5ed-426c-bb3e-21eff088fb3e-kube-api-access-hvw7f\") pod \"image-registry-66df7c8f76-7cztq\" (UID: 
\"fedc64f9-b5ed-426c-bb3e-21eff088fb3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:25 crc kubenswrapper[4762]: I0217 17:53:25.847077 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.267809 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7cztq"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.653812 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zfghh"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.654058 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zfghh" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="registry-server" containerID="cri-o://090d7a1886c00978a221f82900cdc10774825743f5c2f80e6e946d59b359601b" gracePeriod=30 Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.667935 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jnlvk"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.668260 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jnlvk" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="registry-server" containerID="cri-o://ebf9a27c2db6c94ac0f551cb66113404401f158ea4c94a05c31955b9bed29539" gracePeriod=30 Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.684108 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gktn"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.684370 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" 
podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" containerID="cri-o://289c2d484da2064cbe834acdc3553b856255f20b383be346003cfc595597dd94" gracePeriod=30 Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.690666 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmbwb"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.690944 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fmbwb" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="registry-server" containerID="cri-o://c5cbaae27108ad4ba815a202093e3ad495655b0875b89ed1c31598e8ab418dee" gracePeriod=30 Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.699725 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq9qr"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.700048 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jq9qr" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="registry-server" containerID="cri-o://9dcdd00effdadf1d349ce93fb8d2a98e3aa4612aeb337b6f07bdd3f76796e97f" gracePeriod=30 Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.703803 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mh4k"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.706665 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.717558 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mh4k"] Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.881453 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.881882 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rl62\" (UniqueName: \"kubernetes.io/projected/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-kube-api-access-6rl62\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.881918 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.983204 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: 
\"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.983298 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rl62\" (UniqueName: \"kubernetes.io/projected/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-kube-api-access-6rl62\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.983331 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.985736 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:26 crc kubenswrapper[4762]: I0217 17:53:26.990163 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.000185 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6rl62\" (UniqueName: \"kubernetes.io/projected/ae055f49-1dcf-4008-85fe-2f3ca1d45a75-kube-api-access-6rl62\") pod \"marketplace-operator-79b997595-4mh4k\" (UID: \"ae055f49-1dcf-4008-85fe-2f3ca1d45a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.015071 4762 generic.go:334] "Generic (PLEG): container finished" podID="a0697342-ade9-480a-9ac9-074416d620ef" containerID="ebf9a27c2db6c94ac0f551cb66113404401f158ea4c94a05c31955b9bed29539" exitCode=0 Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.015112 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnlvk" event={"ID":"a0697342-ade9-480a-9ac9-074416d620ef","Type":"ContainerDied","Data":"ebf9a27c2db6c94ac0f551cb66113404401f158ea4c94a05c31955b9bed29539"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.017762 4762 generic.go:334] "Generic (PLEG): container finished" podID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerID="9dcdd00effdadf1d349ce93fb8d2a98e3aa4612aeb337b6f07bdd3f76796e97f" exitCode=0 Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.017827 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerDied","Data":"9dcdd00effdadf1d349ce93fb8d2a98e3aa4612aeb337b6f07bdd3f76796e97f"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.020060 4762 generic.go:334] "Generic (PLEG): container finished" podID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerID="c5cbaae27108ad4ba815a202093e3ad495655b0875b89ed1c31598e8ab418dee" exitCode=0 Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.020116 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmbwb" 
event={"ID":"1ff10a6d-758d-44f1-bc36-f2843c20401c","Type":"ContainerDied","Data":"c5cbaae27108ad4ba815a202093e3ad495655b0875b89ed1c31598e8ab418dee"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.021685 4762 generic.go:334] "Generic (PLEG): container finished" podID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerID="289c2d484da2064cbe834acdc3553b856255f20b383be346003cfc595597dd94" exitCode=0 Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.021743 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" event={"ID":"2d3444be-9dcc-4072-9735-120bfeaa36aa","Type":"ContainerDied","Data":"289c2d484da2064cbe834acdc3553b856255f20b383be346003cfc595597dd94"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.021770 4762 scope.go:117] "RemoveContainer" containerID="1a0143c62ed1a62e452f6b8b766a823202c59d3d24486e467723cd1a6b4adaa2" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.023363 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerID="090d7a1886c00978a221f82900cdc10774825743f5c2f80e6e946d59b359601b" exitCode=0 Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.023412 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zfghh" event={"ID":"ae06034f-323c-4a19-95bb-ba8c21fda464","Type":"ContainerDied","Data":"090d7a1886c00978a221f82900cdc10774825743f5c2f80e6e946d59b359601b"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.023431 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zfghh" event={"ID":"ae06034f-323c-4a19-95bb-ba8c21fda464","Type":"ContainerDied","Data":"7addc00f060e7698fc1eee97822d05d624f0f3708d34283f09d46c4af3fb062f"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.023443 4762 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7addc00f060e7698fc1eee97822d05d624f0f3708d34283f09d46c4af3fb062f" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.024455 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" event={"ID":"fedc64f9-b5ed-426c-bb3e-21eff088fb3e","Type":"ContainerStarted","Data":"d39e03516f7bdc24934a477d56e76fa42d5997ec86c3b9083dce6dfb4e0c75e3"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.024479 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" event={"ID":"fedc64f9-b5ed-426c-bb3e-21eff088fb3e","Type":"ContainerStarted","Data":"0bdca232851ce7af9069b3f8c22cb6f8f0a571b8e65843def821d85970d9c947"} Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.025256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.030572 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.056770 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" podStartSLOduration=2.056751567 podStartE2EDuration="2.056751567s" podCreationTimestamp="2026-02-17 17:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:53:27.056143558 +0000 UTC m=+358.701061568" watchObservedRunningTime="2026-02-17 17:53:27.056751567 +0000 UTC m=+358.701669587" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.117804 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.149027 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.198085 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.203340 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.213630 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.288078 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k58wf\" (UniqueName: \"kubernetes.io/projected/a0697342-ade9-480a-9ac9-074416d620ef-kube-api-access-k58wf\") pod \"a0697342-ade9-480a-9ac9-074416d620ef\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.288129 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-utilities\") pod \"ae06034f-323c-4a19-95bb-ba8c21fda464\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.288156 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-utilities\") pod \"a0697342-ade9-480a-9ac9-074416d620ef\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 
17:53:27.288182 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvbdb\" (UniqueName: \"kubernetes.io/projected/ae06034f-323c-4a19-95bb-ba8c21fda464-kube-api-access-rvbdb\") pod \"ae06034f-323c-4a19-95bb-ba8c21fda464\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.288225 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-catalog-content\") pod \"ae06034f-323c-4a19-95bb-ba8c21fda464\" (UID: \"ae06034f-323c-4a19-95bb-ba8c21fda464\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.288255 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-catalog-content\") pod \"a0697342-ade9-480a-9ac9-074416d620ef\" (UID: \"a0697342-ade9-480a-9ac9-074416d620ef\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.289071 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-utilities" (OuterVolumeSpecName: "utilities") pod "ae06034f-323c-4a19-95bb-ba8c21fda464" (UID: "ae06034f-323c-4a19-95bb-ba8c21fda464"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.289193 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-utilities" (OuterVolumeSpecName: "utilities") pod "a0697342-ade9-480a-9ac9-074416d620ef" (UID: "a0697342-ade9-480a-9ac9-074416d620ef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.289290 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.289306 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.291944 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0697342-ade9-480a-9ac9-074416d620ef-kube-api-access-k58wf" (OuterVolumeSpecName: "kube-api-access-k58wf") pod "a0697342-ade9-480a-9ac9-074416d620ef" (UID: "a0697342-ade9-480a-9ac9-074416d620ef"). InnerVolumeSpecName "kube-api-access-k58wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.292315 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae06034f-323c-4a19-95bb-ba8c21fda464-kube-api-access-rvbdb" (OuterVolumeSpecName: "kube-api-access-rvbdb") pod "ae06034f-323c-4a19-95bb-ba8c21fda464" (UID: "ae06034f-323c-4a19-95bb-ba8c21fda464"). InnerVolumeSpecName "kube-api-access-rvbdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.349952 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0697342-ade9-480a-9ac9-074416d620ef" (UID: "a0697342-ade9-480a-9ac9-074416d620ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.356185 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae06034f-323c-4a19-95bb-ba8c21fda464" (UID: "ae06034f-323c-4a19-95bb-ba8c21fda464"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390048 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-utilities\") pod \"1ff10a6d-758d-44f1-bc36-f2843c20401c\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390131 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-829rm\" (UniqueName: \"kubernetes.io/projected/625b741e-9e06-4f4d-a143-8a576c59eb70-kube-api-access-829rm\") pod \"625b741e-9e06-4f4d-a143-8a576c59eb70\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390172 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-operator-metrics\") pod \"2d3444be-9dcc-4072-9735-120bfeaa36aa\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390200 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts6nb\" (UniqueName: \"kubernetes.io/projected/2d3444be-9dcc-4072-9735-120bfeaa36aa-kube-api-access-ts6nb\") pod \"2d3444be-9dcc-4072-9735-120bfeaa36aa\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " Feb 17 17:53:27 crc 
kubenswrapper[4762]: I0217 17:53:27.390220 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-catalog-content\") pod \"625b741e-9e06-4f4d-a143-8a576c59eb70\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390280 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-trusted-ca\") pod \"2d3444be-9dcc-4072-9735-120bfeaa36aa\" (UID: \"2d3444be-9dcc-4072-9735-120bfeaa36aa\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390334 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-utilities\") pod \"625b741e-9e06-4f4d-a143-8a576c59eb70\" (UID: \"625b741e-9e06-4f4d-a143-8a576c59eb70\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390352 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ktv\" (UniqueName: \"kubernetes.io/projected/1ff10a6d-758d-44f1-bc36-f2843c20401c-kube-api-access-28ktv\") pod \"1ff10a6d-758d-44f1-bc36-f2843c20401c\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390407 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-catalog-content\") pod \"1ff10a6d-758d-44f1-bc36-f2843c20401c\" (UID: \"1ff10a6d-758d-44f1-bc36-f2843c20401c\") " Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390685 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae06034f-323c-4a19-95bb-ba8c21fda464-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390699 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0697342-ade9-480a-9ac9-074416d620ef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390730 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k58wf\" (UniqueName: \"kubernetes.io/projected/a0697342-ade9-480a-9ac9-074416d620ef-kube-api-access-k58wf\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390740 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvbdb\" (UniqueName: \"kubernetes.io/projected/ae06034f-323c-4a19-95bb-ba8c21fda464-kube-api-access-rvbdb\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.390930 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-utilities" (OuterVolumeSpecName: "utilities") pod "1ff10a6d-758d-44f1-bc36-f2843c20401c" (UID: "1ff10a6d-758d-44f1-bc36-f2843c20401c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.400362 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2d3444be-9dcc-4072-9735-120bfeaa36aa" (UID: "2d3444be-9dcc-4072-9735-120bfeaa36aa"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.400614 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3444be-9dcc-4072-9735-120bfeaa36aa-kube-api-access-ts6nb" (OuterVolumeSpecName: "kube-api-access-ts6nb") pod "2d3444be-9dcc-4072-9735-120bfeaa36aa" (UID: "2d3444be-9dcc-4072-9735-120bfeaa36aa"). InnerVolumeSpecName "kube-api-access-ts6nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.400920 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-utilities" (OuterVolumeSpecName: "utilities") pod "625b741e-9e06-4f4d-a143-8a576c59eb70" (UID: "625b741e-9e06-4f4d-a143-8a576c59eb70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.403187 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2d3444be-9dcc-4072-9735-120bfeaa36aa" (UID: "2d3444be-9dcc-4072-9735-120bfeaa36aa"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.405335 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625b741e-9e06-4f4d-a143-8a576c59eb70-kube-api-access-829rm" (OuterVolumeSpecName: "kube-api-access-829rm") pod "625b741e-9e06-4f4d-a143-8a576c59eb70" (UID: "625b741e-9e06-4f4d-a143-8a576c59eb70"). InnerVolumeSpecName "kube-api-access-829rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.410895 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff10a6d-758d-44f1-bc36-f2843c20401c-kube-api-access-28ktv" (OuterVolumeSpecName: "kube-api-access-28ktv") pod "1ff10a6d-758d-44f1-bc36-f2843c20401c" (UID: "1ff10a6d-758d-44f1-bc36-f2843c20401c"). InnerVolumeSpecName "kube-api-access-28ktv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.437506 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ff10a6d-758d-44f1-bc36-f2843c20401c" (UID: "1ff10a6d-758d-44f1-bc36-f2843c20401c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493272 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493346 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts6nb\" (UniqueName: \"kubernetes.io/projected/2d3444be-9dcc-4072-9735-120bfeaa36aa-kube-api-access-ts6nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493360 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3444be-9dcc-4072-9735-120bfeaa36aa-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493373 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493386 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ktv\" (UniqueName: \"kubernetes.io/projected/1ff10a6d-758d-44f1-bc36-f2843c20401c-kube-api-access-28ktv\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493399 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493409 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff10a6d-758d-44f1-bc36-f2843c20401c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.493423 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-829rm\" (UniqueName: \"kubernetes.io/projected/625b741e-9e06-4f4d-a143-8a576c59eb70-kube-api-access-829rm\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.496053 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mh4k"] Feb 17 17:53:27 crc kubenswrapper[4762]: W0217 17:53:27.503949 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae055f49_1dcf_4008_85fe_2f3ca1d45a75.slice/crio-d3611e998bcd062b4bbccc345ab216d4830bfb862ea1b87cb17ecd9ac7b73498 WatchSource:0}: Error finding container d3611e998bcd062b4bbccc345ab216d4830bfb862ea1b87cb17ecd9ac7b73498: Status 404 returned error can't find the container with id d3611e998bcd062b4bbccc345ab216d4830bfb862ea1b87cb17ecd9ac7b73498 Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.539027 4762 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "625b741e-9e06-4f4d-a143-8a576c59eb70" (UID: "625b741e-9e06-4f4d-a143-8a576c59eb70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:53:27 crc kubenswrapper[4762]: I0217 17:53:27.594494 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/625b741e-9e06-4f4d-a143-8a576c59eb70-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.031416 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnlvk" event={"ID":"a0697342-ade9-480a-9ac9-074416d620ef","Type":"ContainerDied","Data":"fdfb4f69a5c7e02e4e931f5d83eebf18bda30fe70e127059e07dce3fa7ea7d20"} Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.031487 4762 scope.go:117] "RemoveContainer" containerID="ebf9a27c2db6c94ac0f551cb66113404401f158ea4c94a05c31955b9bed29539" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.031438 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jnlvk" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.035784 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq9qr" event={"ID":"625b741e-9e06-4f4d-a143-8a576c59eb70","Type":"ContainerDied","Data":"efaebd4160cef745bbc3d3f9b1a7cc6c5b222fadc5cc8c3e004af40fd624c74b"} Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.036021 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq9qr" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.037570 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" event={"ID":"ae055f49-1dcf-4008-85fe-2f3ca1d45a75","Type":"ContainerStarted","Data":"4c846f753b5fc27883519bda964bf8fcb94c9b6d69e2f7503a9a59b8df62093a"} Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.037647 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" event={"ID":"ae055f49-1dcf-4008-85fe-2f3ca1d45a75","Type":"ContainerStarted","Data":"d3611e998bcd062b4bbccc345ab216d4830bfb862ea1b87cb17ecd9ac7b73498"} Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.037670 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.042082 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmbwb" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.044324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmbwb" event={"ID":"1ff10a6d-758d-44f1-bc36-f2843c20401c","Type":"ContainerDied","Data":"4e1d3ac28bf587f09158284f2be46f6f28801d663a60a66cd7eede0f363a7279"} Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.044404 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.047789 4762 scope.go:117] "RemoveContainer" containerID="93b51f809932e401efbccc816311187c7bb2de250bd0d63898e4a7fbcc395ad9" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.049491 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" event={"ID":"2d3444be-9dcc-4072-9735-120bfeaa36aa","Type":"ContainerDied","Data":"6e71008c70a4a148e11f669194a4876c1e37a9077f104ae176f93ed92f9b2ec4"} Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.049503 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zfghh" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.050946 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gktn" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.059003 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4mh4k" podStartSLOduration=2.058986842 podStartE2EDuration="2.058986842s" podCreationTimestamp="2026-02-17 17:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:53:28.055545279 +0000 UTC m=+359.700463289" watchObservedRunningTime="2026-02-17 17:53:28.058986842 +0000 UTC m=+359.703904852" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.066533 4762 scope.go:117] "RemoveContainer" containerID="bf9e30eac18fa99baf62c0ac945a097e4770c92ce3f7a92a21342329fbb44a8d" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.085039 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jnlvk"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.089864 4762 scope.go:117] "RemoveContainer" containerID="9dcdd00effdadf1d349ce93fb8d2a98e3aa4612aeb337b6f07bdd3f76796e97f" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.090842 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jnlvk"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.126853 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gktn"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.127509 4762 scope.go:117] "RemoveContainer" containerID="83e03edf99f0f22e0ce37174107eccc07ac400b1839aa52f369039f2afec1d7d" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.130071 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gktn"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.141813 4762 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zfghh"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.156115 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zfghh"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.157304 4762 scope.go:117] "RemoveContainer" containerID="ec139cdcb22d72a50ae1252262a6b3428847883c4a765f1f52cfb241145e9eab" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.161339 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq9qr"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.168100 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jq9qr"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.170640 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmbwb"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.173966 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmbwb"] Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.180053 4762 scope.go:117] "RemoveContainer" containerID="c5cbaae27108ad4ba815a202093e3ad495655b0875b89ed1c31598e8ab418dee" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.196747 4762 scope.go:117] "RemoveContainer" containerID="aa754b1de96634d012912429bb5011050977887a3b025cf46c997c6256418552" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.208338 4762 scope.go:117] "RemoveContainer" containerID="ff01d2d6dad4ff3c81083ed5a1e8fa149fd989ba191b5471b226a63ee4e54db1" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.219520 4762 scope.go:117] "RemoveContainer" containerID="289c2d484da2064cbe834acdc3553b856255f20b383be346003cfc595597dd94" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869060 4762 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-69hrp"] Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869311 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869326 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869339 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869347 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869360 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869368 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869377 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869385 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869396 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869403 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869413 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869421 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869431 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869439 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869450 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869457 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869466 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869474 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869485 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869493 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869500 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869509 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869518 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869525 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869536 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869543 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="extract-utilities" Feb 17 17:53:28 crc kubenswrapper[4762]: E0217 17:53:28.869554 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869561 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="extract-content" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869688 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" containerName="registry-server" Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869701 4762 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" containerName="registry-server"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869718 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869728 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" containerName="registry-server"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869739 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" containerName="marketplace-operator"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.869749 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0697342-ade9-480a-9ac9-074416d620ef" containerName="registry-server"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.870655 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.872704 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 17:53:28 crc kubenswrapper[4762]: I0217 17:53:28.877962 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69hrp"]
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.012340 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a52b4d8-7eba-4af4-850d-565a3136fc8c-catalog-content\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.012403 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cb7p\" (UniqueName: \"kubernetes.io/projected/8a52b4d8-7eba-4af4-850d-565a3136fc8c-kube-api-access-5cb7p\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.012438 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a52b4d8-7eba-4af4-850d-565a3136fc8c-utilities\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.055167 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff10a6d-758d-44f1-bc36-f2843c20401c" path="/var/lib/kubelet/pods/1ff10a6d-758d-44f1-bc36-f2843c20401c/volumes"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.055820 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3444be-9dcc-4072-9735-120bfeaa36aa" path="/var/lib/kubelet/pods/2d3444be-9dcc-4072-9735-120bfeaa36aa/volumes"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.057227 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625b741e-9e06-4f4d-a143-8a576c59eb70" path="/var/lib/kubelet/pods/625b741e-9e06-4f4d-a143-8a576c59eb70/volumes"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.059670 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0697342-ade9-480a-9ac9-074416d620ef" path="/var/lib/kubelet/pods/a0697342-ade9-480a-9ac9-074416d620ef/volumes"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.061763 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae06034f-323c-4a19-95bb-ba8c21fda464" path="/var/lib/kubelet/pods/ae06034f-323c-4a19-95bb-ba8c21fda464/volumes"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.074226 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbswz"]
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.079040 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.079345 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbswz"]
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.088560 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.113573 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a52b4d8-7eba-4af4-850d-565a3136fc8c-catalog-content\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.113884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cb7p\" (UniqueName: \"kubernetes.io/projected/8a52b4d8-7eba-4af4-850d-565a3136fc8c-kube-api-access-5cb7p\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.113951 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a52b4d8-7eba-4af4-850d-565a3136fc8c-utilities\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.114322 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a52b4d8-7eba-4af4-850d-565a3136fc8c-catalog-content\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.114353 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a52b4d8-7eba-4af4-850d-565a3136fc8c-utilities\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.131188 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cb7p\" (UniqueName: \"kubernetes.io/projected/8a52b4d8-7eba-4af4-850d-565a3136fc8c-kube-api-access-5cb7p\") pod \"redhat-marketplace-69hrp\" (UID: \"8a52b4d8-7eba-4af4-850d-565a3136fc8c\") " pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.202248 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.209809 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69hrp"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.214879 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbswl\" (UniqueName: \"kubernetes.io/projected/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-kube-api-access-vbswl\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.214916 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-catalog-content\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.215038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-utilities\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.316530 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-utilities\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.316905 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbswl\" (UniqueName: \"kubernetes.io/projected/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-kube-api-access-vbswl\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.316932 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-catalog-content\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.317498 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-catalog-content\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.317759 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-utilities\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.345208 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbswl\" (UniqueName: \"kubernetes.io/projected/45a1e640-3aeb-47f7-8a26-a578cf7d7c18-kube-api-access-vbswl\") pod \"certified-operators-wbswz\" (UID: \"45a1e640-3aeb-47f7-8a26-a578cf7d7c18\") " pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.396906 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbswz"
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.606823 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69hrp"]
Feb 17 17:53:29 crc kubenswrapper[4762]: W0217 17:53:29.608791 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a52b4d8_7eba_4af4_850d_565a3136fc8c.slice/crio-399f4374de78803fff182cd497d6836d34aaa5207e0be83abb46e3e600315ff0 WatchSource:0}: Error finding container 399f4374de78803fff182cd497d6836d34aaa5207e0be83abb46e3e600315ff0: Status 404 returned error can't find the container with id 399f4374de78803fff182cd497d6836d34aaa5207e0be83abb46e3e600315ff0
Feb 17 17:53:29 crc kubenswrapper[4762]: I0217 17:53:29.764819 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbswz"]
Feb 17 17:53:29 crc kubenswrapper[4762]: W0217 17:53:29.814368 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a1e640_3aeb_47f7_8a26_a578cf7d7c18.slice/crio-d62662b26836c912490f2dcc89557a8918c22a7b121a00fce136a5bbde1cfc44 WatchSource:0}: Error finding container d62662b26836c912490f2dcc89557a8918c22a7b121a00fce136a5bbde1cfc44: Status 404 returned error can't find the container with id d62662b26836c912490f2dcc89557a8918c22a7b121a00fce136a5bbde1cfc44
Feb 17 17:53:30 crc kubenswrapper[4762]: I0217 17:53:30.076934 4762 generic.go:334] "Generic (PLEG): container finished" podID="45a1e640-3aeb-47f7-8a26-a578cf7d7c18" containerID="243f9e5fb3b86eb3caa68fae3854d299e494b05df0f1d1b3c2652a5e03b2d1f5" exitCode=0
Feb 17 17:53:30 crc kubenswrapper[4762]: I0217 17:53:30.077015 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbswz" event={"ID":"45a1e640-3aeb-47f7-8a26-a578cf7d7c18","Type":"ContainerDied","Data":"243f9e5fb3b86eb3caa68fae3854d299e494b05df0f1d1b3c2652a5e03b2d1f5"}
Feb 17 17:53:30 crc kubenswrapper[4762]: I0217 17:53:30.077047 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbswz" event={"ID":"45a1e640-3aeb-47f7-8a26-a578cf7d7c18","Type":"ContainerStarted","Data":"d62662b26836c912490f2dcc89557a8918c22a7b121a00fce136a5bbde1cfc44"}
Feb 17 17:53:30 crc kubenswrapper[4762]: I0217 17:53:30.080569 4762 generic.go:334] "Generic (PLEG): container finished" podID="8a52b4d8-7eba-4af4-850d-565a3136fc8c" containerID="1150ff58939e563d5c5a1ec7a16ef9d2cc27938917eb9d44c773cbc35e53d60e" exitCode=0
Feb 17 17:53:30 crc kubenswrapper[4762]: I0217 17:53:30.080903 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69hrp" event={"ID":"8a52b4d8-7eba-4af4-850d-565a3136fc8c","Type":"ContainerDied","Data":"1150ff58939e563d5c5a1ec7a16ef9d2cc27938917eb9d44c773cbc35e53d60e"}
Feb 17 17:53:30 crc kubenswrapper[4762]: I0217 17:53:30.080962 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69hrp" event={"ID":"8a52b4d8-7eba-4af4-850d-565a3136fc8c","Type":"ContainerStarted","Data":"399f4374de78803fff182cd497d6836d34aaa5207e0be83abb46e3e600315ff0"}
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.086458 4762 generic.go:334] "Generic (PLEG): container finished" podID="8a52b4d8-7eba-4af4-850d-565a3136fc8c" containerID="711ef36f5d85e907b97f8ab820ae33db3c87c07705b05b743297666780f7b703" exitCode=0
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.086557 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69hrp" event={"ID":"8a52b4d8-7eba-4af4-850d-565a3136fc8c","Type":"ContainerDied","Data":"711ef36f5d85e907b97f8ab820ae33db3c87c07705b05b743297666780f7b703"}
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.093014 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbswz" event={"ID":"45a1e640-3aeb-47f7-8a26-a578cf7d7c18","Type":"ContainerStarted","Data":"f0a9fead6a71028b8b9c4bff51c3c0c690d8d31a6b806b4fb596896e272dfa26"}
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.267270 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7bzhc"]
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.268298 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.270419 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.278349 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bzhc"]
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.443531 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6wh\" (UniqueName: \"kubernetes.io/projected/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-kube-api-access-7j6wh\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.443944 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-catalog-content\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.443975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-utilities\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.470955 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8gcnq"]
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.473730 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.476512 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.480544 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gcnq"]
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.545286 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-catalog-content\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.545323 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-utilities\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.545393 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6wh\" (UniqueName: \"kubernetes.io/projected/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-kube-api-access-7j6wh\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.545961 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-catalog-content\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.546026 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-utilities\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.563769 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6wh\" (UniqueName: \"kubernetes.io/projected/18a63ac5-9c0b-4b15-96ea-7bb2d166525e-kube-api-access-7j6wh\") pod \"community-operators-7bzhc\" (UID: \"18a63ac5-9c0b-4b15-96ea-7bb2d166525e\") " pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.584394 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bzhc"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.649804 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209bf713-7d49-4554-96bd-4922d360dbe7-utilities\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.649881 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209bf713-7d49-4554-96bd-4922d360dbe7-catalog-content\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.649939 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb284\" (UniqueName: \"kubernetes.io/projected/209bf713-7d49-4554-96bd-4922d360dbe7-kube-api-access-tb284\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.751510 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209bf713-7d49-4554-96bd-4922d360dbe7-utilities\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.751946 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209bf713-7d49-4554-96bd-4922d360dbe7-catalog-content\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.752032 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb284\" (UniqueName: \"kubernetes.io/projected/209bf713-7d49-4554-96bd-4922d360dbe7-kube-api-access-tb284\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.752137 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209bf713-7d49-4554-96bd-4922d360dbe7-utilities\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.752371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209bf713-7d49-4554-96bd-4922d360dbe7-catalog-content\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.768320 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb284\" (UniqueName: \"kubernetes.io/projected/209bf713-7d49-4554-96bd-4922d360dbe7-kube-api-access-tb284\") pod \"redhat-operators-8gcnq\" (UID: \"209bf713-7d49-4554-96bd-4922d360dbe7\") " pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.840332 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gcnq"
Feb 17 17:53:31 crc kubenswrapper[4762]: I0217 17:53:31.961102 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bzhc"]
Feb 17 17:53:31 crc kubenswrapper[4762]: W0217 17:53:31.965873 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a63ac5_9c0b_4b15_96ea_7bb2d166525e.slice/crio-5af60832385d390dbd9aa7c229987273c165a76eeefbc9c5161837f1fb3e6b53 WatchSource:0}: Error finding container 5af60832385d390dbd9aa7c229987273c165a76eeefbc9c5161837f1fb3e6b53: Status 404 returned error can't find the container with id 5af60832385d390dbd9aa7c229987273c165a76eeefbc9c5161837f1fb3e6b53
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.098773 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bzhc" event={"ID":"18a63ac5-9c0b-4b15-96ea-7bb2d166525e","Type":"ContainerStarted","Data":"42dc397e06b0e04106bedfcf1519d1af7cacf9492951f57d23611f7e9c1cc675"}
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.098831 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bzhc" event={"ID":"18a63ac5-9c0b-4b15-96ea-7bb2d166525e","Type":"ContainerStarted","Data":"5af60832385d390dbd9aa7c229987273c165a76eeefbc9c5161837f1fb3e6b53"}
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.103552 4762 generic.go:334] "Generic (PLEG): container finished" podID="45a1e640-3aeb-47f7-8a26-a578cf7d7c18" containerID="f0a9fead6a71028b8b9c4bff51c3c0c690d8d31a6b806b4fb596896e272dfa26" exitCode=0
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.103630 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbswz" event={"ID":"45a1e640-3aeb-47f7-8a26-a578cf7d7c18","Type":"ContainerDied","Data":"f0a9fead6a71028b8b9c4bff51c3c0c690d8d31a6b806b4fb596896e272dfa26"}
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.106465 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69hrp" event={"ID":"8a52b4d8-7eba-4af4-850d-565a3136fc8c","Type":"ContainerStarted","Data":"31a1018aedc125cd7fce6c5417de9287f5c62e26e34d78b6654235103638dcf8"}
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.141654 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-69hrp" podStartSLOduration=2.753741618 podStartE2EDuration="4.141637926s" podCreationTimestamp="2026-02-17 17:53:28 +0000 UTC" firstStartedPulling="2026-02-17 17:53:30.08269736 +0000 UTC m=+361.727615370" lastFinishedPulling="2026-02-17 17:53:31.470593668 +0000 UTC m=+363.115511678" observedRunningTime="2026-02-17 17:53:32.139906534 +0000 UTC m=+363.784824544" watchObservedRunningTime="2026-02-17 17:53:32.141637926 +0000 UTC m=+363.786555936"
Feb 17 17:53:32 crc kubenswrapper[4762]: I0217 17:53:32.236683 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gcnq"]
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.112719 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a63ac5-9c0b-4b15-96ea-7bb2d166525e" containerID="42dc397e06b0e04106bedfcf1519d1af7cacf9492951f57d23611f7e9c1cc675" exitCode=0
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.112843 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bzhc" event={"ID":"18a63ac5-9c0b-4b15-96ea-7bb2d166525e","Type":"ContainerDied","Data":"42dc397e06b0e04106bedfcf1519d1af7cacf9492951f57d23611f7e9c1cc675"}
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.115532 4762 generic.go:334] "Generic (PLEG): container finished" podID="209bf713-7d49-4554-96bd-4922d360dbe7" containerID="e2119beb5fd9cc678df7662c43bc05ff4ff4baa883544c8e31126276a5d4f20d" exitCode=0
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.115665 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gcnq" event={"ID":"209bf713-7d49-4554-96bd-4922d360dbe7","Type":"ContainerDied","Data":"e2119beb5fd9cc678df7662c43bc05ff4ff4baa883544c8e31126276a5d4f20d"}
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.115701 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gcnq" event={"ID":"209bf713-7d49-4554-96bd-4922d360dbe7","Type":"ContainerStarted","Data":"956888b55c909fb759eeef869bef8bc3a548c1841e9fd770388f0d62ecdf4107"}
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.125713 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbswz" event={"ID":"45a1e640-3aeb-47f7-8a26-a578cf7d7c18","Type":"ContainerStarted","Data":"92f04db3536e0405409f07fb4b21c6fcab060641c74e95c5b8f9e9ed90475f80"}
Feb 17 17:53:33 crc kubenswrapper[4762]: I0217 17:53:33.175148 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbswz" podStartSLOduration=1.766757006 podStartE2EDuration="4.175131538s" podCreationTimestamp="2026-02-17 17:53:29 +0000 UTC" firstStartedPulling="2026-02-17 17:53:30.078495554 +0000 UTC m=+361.723413564" lastFinishedPulling="2026-02-17 17:53:32.486870076 +0000 UTC m=+364.131788096" observedRunningTime="2026-02-17 17:53:33.172761497 +0000 UTC m=+364.817679507" watchObservedRunningTime="2026-02-17 17:53:33.175131538 +0000 UTC m=+364.820049548"
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.033469 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"]
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.034113 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" podUID="bd19545c-53f3-4b81-8e8f-4293cd706247" containerName="controller-manager" containerID="cri-o://0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb" gracePeriod=30
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.131496 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bzhc" event={"ID":"18a63ac5-9c0b-4b15-96ea-7bb2d166525e","Type":"ContainerStarted","Data":"3f50381c17c399a68870473422c5307fedb545f3ee8622a36fdb0075c2d51b01"}
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.134188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gcnq" event={"ID":"209bf713-7d49-4554-96bd-4922d360dbe7","Type":"ContainerStarted","Data":"8c0f947f75873ec9406efb0304c36c0259a4937a37a1d4678e1323633201389e"}
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.421573 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.558782 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.558856 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.585526 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-client-ca\") pod \"bd19545c-53f3-4b81-8e8f-4293cd706247\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") "
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.585585 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-config\") pod \"bd19545c-53f3-4b81-8e8f-4293cd706247\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") "
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.585662 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-proxy-ca-bundles\") pod \"bd19545c-53f3-4b81-8e8f-4293cd706247\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") "
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.585742 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2w5s\" (UniqueName: \"kubernetes.io/projected/bd19545c-53f3-4b81-8e8f-4293cd706247-kube-api-access-s2w5s\") pod \"bd19545c-53f3-4b81-8e8f-4293cd706247\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") "
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.585768 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd19545c-53f3-4b81-8e8f-4293cd706247-serving-cert\") pod \"bd19545c-53f3-4b81-8e8f-4293cd706247\" (UID: \"bd19545c-53f3-4b81-8e8f-4293cd706247\") "
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.586595 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-config" (OuterVolumeSpecName: "config") pod "bd19545c-53f3-4b81-8e8f-4293cd706247" (UID: "bd19545c-53f3-4b81-8e8f-4293cd706247"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.586586 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bd19545c-53f3-4b81-8e8f-4293cd706247" (UID: "bd19545c-53f3-4b81-8e8f-4293cd706247"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.586701 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd19545c-53f3-4b81-8e8f-4293cd706247" (UID: "bd19545c-53f3-4b81-8e8f-4293cd706247"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.590870 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19545c-53f3-4b81-8e8f-4293cd706247-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd19545c-53f3-4b81-8e8f-4293cd706247" (UID: "bd19545c-53f3-4b81-8e8f-4293cd706247"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.591193 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd19545c-53f3-4b81-8e8f-4293cd706247-kube-api-access-s2w5s" (OuterVolumeSpecName: "kube-api-access-s2w5s") pod "bd19545c-53f3-4b81-8e8f-4293cd706247" (UID: "bd19545c-53f3-4b81-8e8f-4293cd706247"). InnerVolumeSpecName "kube-api-access-s2w5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.687462 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.687493 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2w5s\" (UniqueName: \"kubernetes.io/projected/bd19545c-53f3-4b81-8e8f-4293cd706247-kube-api-access-s2w5s\") on node \"crc\" DevicePath \"\""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.687508 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd19545c-53f3-4b81-8e8f-4293cd706247-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.687517 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:53:34 crc kubenswrapper[4762]: I0217 17:53:34.687526 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd19545c-53f3-4b81-8e8f-4293cd706247-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.144803 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a63ac5-9c0b-4b15-96ea-7bb2d166525e" containerID="3f50381c17c399a68870473422c5307fedb545f3ee8622a36fdb0075c2d51b01" exitCode=0
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.144992 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bzhc" event={"ID":"18a63ac5-9c0b-4b15-96ea-7bb2d166525e","Type":"ContainerDied","Data":"3f50381c17c399a68870473422c5307fedb545f3ee8622a36fdb0075c2d51b01"}
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.150663 4762 generic.go:334] "Generic (PLEG): container finished" podID="209bf713-7d49-4554-96bd-4922d360dbe7" containerID="8c0f947f75873ec9406efb0304c36c0259a4937a37a1d4678e1323633201389e" exitCode=0
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.150801 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gcnq" event={"ID":"209bf713-7d49-4554-96bd-4922d360dbe7","Type":"ContainerDied","Data":"8c0f947f75873ec9406efb0304c36c0259a4937a37a1d4678e1323633201389e"}
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.155552 4762 generic.go:334] "Generic (PLEG): container finished" podID="bd19545c-53f3-4b81-8e8f-4293cd706247" containerID="0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb" exitCode=0
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.155598 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" event={"ID":"bd19545c-53f3-4b81-8e8f-4293cd706247","Type":"ContainerDied","Data":"0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb"}
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.155655 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77" event={"ID":"bd19545c-53f3-4b81-8e8f-4293cd706247","Type":"ContainerDied","Data":"5dc08bfa83203c274815ba506890fd9e6eeef93245fa53c92c55b04987ea1dde"}
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.155678 4762 scope.go:117] "RemoveContainer" containerID="0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb"
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.155692 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.183403 4762 scope.go:117] "RemoveContainer" containerID="0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb"
Feb 17 17:53:35 crc kubenswrapper[4762]: E0217 17:53:35.183913 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb\": container with ID starting with 0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb not found: ID does not exist" containerID="0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb"
Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.183942 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb"} err="failed to get container status \"0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb\": rpc error: code = NotFound desc = could not find container \"0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb\":
container with ID starting with 0cb51918396011e38e4de46ed30b4f4a2b6b0914fef6e6c706272c25a32751cb not found: ID does not exist" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.184793 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"] Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.190571 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5ff6ccf94b-gcs77"] Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.762038 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bb4f986b8-7kz7g"] Feb 17 17:53:35 crc kubenswrapper[4762]: E0217 17:53:35.765550 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd19545c-53f3-4b81-8e8f-4293cd706247" containerName="controller-manager" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.765576 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd19545c-53f3-4b81-8e8f-4293cd706247" containerName="controller-manager" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.765709 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd19545c-53f3-4b81-8e8f-4293cd706247" containerName="controller-manager" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.766197 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.768117 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bb4f986b8-7kz7g"] Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.770659 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.770896 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.770921 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.770682 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.771191 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.772341 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.776544 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.907963 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-config\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " 
pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.908029 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-client-ca\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.908104 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-proxy-ca-bundles\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.908141 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-serving-cert\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:35 crc kubenswrapper[4762]: I0217 17:53:35.908176 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnkt\" (UniqueName: \"kubernetes.io/projected/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-kube-api-access-ptnkt\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.009126 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-config\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.009209 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-client-ca\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.009259 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-proxy-ca-bundles\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.009288 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-serving-cert\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.009314 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnkt\" (UniqueName: \"kubernetes.io/projected/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-kube-api-access-ptnkt\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.010412 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-client-ca\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.010410 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-proxy-ca-bundles\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.010678 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-config\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.017918 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-serving-cert\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.031414 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnkt\" (UniqueName: \"kubernetes.io/projected/da2cdb5c-7128-4c6c-a6c4-7968b6c45259-kube-api-access-ptnkt\") pod \"controller-manager-bb4f986b8-7kz7g\" (UID: \"da2cdb5c-7128-4c6c-a6c4-7968b6c45259\") " pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc 
kubenswrapper[4762]: I0217 17:53:36.096851 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.162641 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gcnq" event={"ID":"209bf713-7d49-4554-96bd-4922d360dbe7","Type":"ContainerStarted","Data":"8a2446924444d234c774448ad91852e8f60d23cc37c14227c8166c4ce94f4b3d"} Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.166439 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bzhc" event={"ID":"18a63ac5-9c0b-4b15-96ea-7bb2d166525e","Type":"ContainerStarted","Data":"f863c183e6e82c909c7a88c982190589e4519199303e466e58243d0506d7985d"} Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.184119 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8gcnq" podStartSLOduration=2.694046841 podStartE2EDuration="5.184103062s" podCreationTimestamp="2026-02-17 17:53:31 +0000 UTC" firstStartedPulling="2026-02-17 17:53:33.117266042 +0000 UTC m=+364.762184052" lastFinishedPulling="2026-02-17 17:53:35.607322263 +0000 UTC m=+367.252240273" observedRunningTime="2026-02-17 17:53:36.180287357 +0000 UTC m=+367.825205387" watchObservedRunningTime="2026-02-17 17:53:36.184103062 +0000 UTC m=+367.829021072" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.200370 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7bzhc" podStartSLOduration=2.6364226 podStartE2EDuration="5.200352639s" podCreationTimestamp="2026-02-17 17:53:31 +0000 UTC" firstStartedPulling="2026-02-17 17:53:33.114647783 +0000 UTC m=+364.759565803" lastFinishedPulling="2026-02-17 17:53:35.678577832 +0000 UTC m=+367.323495842" observedRunningTime="2026-02-17 17:53:36.196106492 +0000 UTC 
m=+367.841024502" watchObservedRunningTime="2026-02-17 17:53:36.200352639 +0000 UTC m=+367.845270649" Feb 17 17:53:36 crc kubenswrapper[4762]: I0217 17:53:36.556896 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bb4f986b8-7kz7g"] Feb 17 17:53:36 crc kubenswrapper[4762]: W0217 17:53:36.561912 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda2cdb5c_7128_4c6c_a6c4_7968b6c45259.slice/crio-b020d91fcc31e8c294ec30cf10228aa09a488ab62a549cc406a3c138d813899b WatchSource:0}: Error finding container b020d91fcc31e8c294ec30cf10228aa09a488ab62a549cc406a3c138d813899b: Status 404 returned error can't find the container with id b020d91fcc31e8c294ec30cf10228aa09a488ab62a549cc406a3c138d813899b Feb 17 17:53:37 crc kubenswrapper[4762]: I0217 17:53:37.043223 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd19545c-53f3-4b81-8e8f-4293cd706247" path="/var/lib/kubelet/pods/bd19545c-53f3-4b81-8e8f-4293cd706247/volumes" Feb 17 17:53:37 crc kubenswrapper[4762]: I0217 17:53:37.172717 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" event={"ID":"da2cdb5c-7128-4c6c-a6c4-7968b6c45259","Type":"ContainerStarted","Data":"e14b6e638908a619c161427983c5c9adb8b884ab063cbcfe1f78f3f03c34ffae"} Feb 17 17:53:37 crc kubenswrapper[4762]: I0217 17:53:37.172798 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" event={"ID":"da2cdb5c-7128-4c6c-a6c4-7968b6c45259","Type":"ContainerStarted","Data":"b020d91fcc31e8c294ec30cf10228aa09a488ab62a549cc406a3c138d813899b"} Feb 17 17:53:37 crc kubenswrapper[4762]: I0217 17:53:37.194838 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" podStartSLOduration=3.194818891 
podStartE2EDuration="3.194818891s" podCreationTimestamp="2026-02-17 17:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:53:37.18977922 +0000 UTC m=+368.834697220" watchObservedRunningTime="2026-02-17 17:53:37.194818891 +0000 UTC m=+368.839736911" Feb 17 17:53:38 crc kubenswrapper[4762]: I0217 17:53:38.177432 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:38 crc kubenswrapper[4762]: I0217 17:53:38.186330 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bb4f986b8-7kz7g" Feb 17 17:53:39 crc kubenswrapper[4762]: I0217 17:53:39.210239 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-69hrp" Feb 17 17:53:39 crc kubenswrapper[4762]: I0217 17:53:39.210582 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-69hrp" Feb 17 17:53:39 crc kubenswrapper[4762]: I0217 17:53:39.260369 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-69hrp" Feb 17 17:53:39 crc kubenswrapper[4762]: I0217 17:53:39.398086 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbswz" Feb 17 17:53:39 crc kubenswrapper[4762]: I0217 17:53:39.398409 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbswz" Feb 17 17:53:39 crc kubenswrapper[4762]: I0217 17:53:39.435332 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbswz" Feb 17 17:53:40 crc kubenswrapper[4762]: I0217 17:53:40.228888 4762 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-69hrp" Feb 17 17:53:40 crc kubenswrapper[4762]: I0217 17:53:40.234059 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbswz" Feb 17 17:53:41 crc kubenswrapper[4762]: I0217 17:53:41.584567 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7bzhc" Feb 17 17:53:41 crc kubenswrapper[4762]: I0217 17:53:41.584946 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7bzhc" Feb 17 17:53:41 crc kubenswrapper[4762]: I0217 17:53:41.641823 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7bzhc" Feb 17 17:53:41 crc kubenswrapper[4762]: I0217 17:53:41.841534 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8gcnq" Feb 17 17:53:41 crc kubenswrapper[4762]: I0217 17:53:41.841594 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8gcnq" Feb 17 17:53:41 crc kubenswrapper[4762]: I0217 17:53:41.886677 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8gcnq" Feb 17 17:53:42 crc kubenswrapper[4762]: I0217 17:53:42.248026 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8gcnq" Feb 17 17:53:42 crc kubenswrapper[4762]: I0217 17:53:42.263859 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7bzhc" Feb 17 17:53:45 crc kubenswrapper[4762]: I0217 17:53:45.855742 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7cztq" Feb 
17 17:53:45 crc kubenswrapper[4762]: I0217 17:53:45.918692 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zc64c"] Feb 17 17:54:04 crc kubenswrapper[4762]: I0217 17:54:04.558305 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:54:04 crc kubenswrapper[4762]: I0217 17:54:04.559064 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:54:10 crc kubenswrapper[4762]: I0217 17:54:10.954518 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" podUID="15469884-f0fd-4460-97dd-6a428a3e7e0d" containerName="registry" containerID="cri-o://ad519fee8a6ce38d7504d01ac7cccd506064da17ceebec5d294ce9e0d1b98172" gracePeriod=30 Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.358492 4762 generic.go:334] "Generic (PLEG): container finished" podID="15469884-f0fd-4460-97dd-6a428a3e7e0d" containerID="ad519fee8a6ce38d7504d01ac7cccd506064da17ceebec5d294ce9e0d1b98172" exitCode=0 Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.358537 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" event={"ID":"15469884-f0fd-4460-97dd-6a428a3e7e0d","Type":"ContainerDied","Data":"ad519fee8a6ce38d7504d01ac7cccd506064da17ceebec5d294ce9e0d1b98172"} Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.358562 4762 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" event={"ID":"15469884-f0fd-4460-97dd-6a428a3e7e0d","Type":"ContainerDied","Data":"4d7b462e8ac8c48e6405b9d0ed1aa64dd75872c57199e2d8cba1bd5de1bbaa4f"} Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.358573 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7b462e8ac8c48e6405b9d0ed1aa64dd75872c57199e2d8cba1bd5de1bbaa4f" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.364967 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520507 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520575 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-bound-sa-token\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520692 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-certificates\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520750 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/15469884-f0fd-4460-97dd-6a428a3e7e0d-ca-trust-extracted\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520783 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vkfp\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-kube-api-access-6vkfp\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520872 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-trusted-ca\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520912 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15469884-f0fd-4460-97dd-6a428a3e7e0d-installation-pull-secrets\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.520985 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-tls\") pod \"15469884-f0fd-4460-97dd-6a428a3e7e0d\" (UID: \"15469884-f0fd-4460-97dd-6a428a3e7e0d\") " Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.522372 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.522573 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.525707 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.526758 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15469884-f0fd-4460-97dd-6a428a3e7e0d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.527792 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.531427 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-kube-api-access-6vkfp" (OuterVolumeSpecName: "kube-api-access-6vkfp") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "kube-api-access-6vkfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.535723 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.536911 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15469884-f0fd-4460-97dd-6a428a3e7e0d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "15469884-f0fd-4460-97dd-6a428a3e7e0d" (UID: "15469884-f0fd-4460-97dd-6a428a3e7e0d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621904 4762 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15469884-f0fd-4460-97dd-6a428a3e7e0d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621942 4762 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621951 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621960 4762 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621968 4762 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15469884-f0fd-4460-97dd-6a428a3e7e0d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621976 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vkfp\" (UniqueName: \"kubernetes.io/projected/15469884-f0fd-4460-97dd-6a428a3e7e0d-kube-api-access-6vkfp\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:11 crc kubenswrapper[4762]: I0217 17:54:11.621984 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15469884-f0fd-4460-97dd-6a428a3e7e0d-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:54:12 crc 
kubenswrapper[4762]: I0217 17:54:12.364337 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zc64c" Feb 17 17:54:12 crc kubenswrapper[4762]: I0217 17:54:12.407919 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zc64c"] Feb 17 17:54:12 crc kubenswrapper[4762]: I0217 17:54:12.418470 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zc64c"] Feb 17 17:54:13 crc kubenswrapper[4762]: I0217 17:54:13.044226 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15469884-f0fd-4460-97dd-6a428a3e7e0d" path="/var/lib/kubelet/pods/15469884-f0fd-4460-97dd-6a428a3e7e0d/volumes" Feb 17 17:54:34 crc kubenswrapper[4762]: I0217 17:54:34.558702 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:54:34 crc kubenswrapper[4762]: I0217 17:54:34.559498 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:54:34 crc kubenswrapper[4762]: I0217 17:54:34.559563 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:54:34 crc kubenswrapper[4762]: I0217 17:54:34.560350 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e4fa34e3eae7dd4023f4a8dcdfb848ad377b3ac4763f97bb9696cc12d23a4871"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:54:34 crc kubenswrapper[4762]: I0217 17:54:34.560449 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" containerID="cri-o://e4fa34e3eae7dd4023f4a8dcdfb848ad377b3ac4763f97bb9696cc12d23a4871" gracePeriod=600 Feb 17 17:54:35 crc kubenswrapper[4762]: I0217 17:54:35.502366 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="e4fa34e3eae7dd4023f4a8dcdfb848ad377b3ac4763f97bb9696cc12d23a4871" exitCode=0 Feb 17 17:54:35 crc kubenswrapper[4762]: I0217 17:54:35.502463 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"e4fa34e3eae7dd4023f4a8dcdfb848ad377b3ac4763f97bb9696cc12d23a4871"} Feb 17 17:54:35 crc kubenswrapper[4762]: I0217 17:54:35.502793 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"2385971b9fcd24d1f36cb6487bdb22e9d18c3d925f8b573f1c69c4a33a447969"} Feb 17 17:54:35 crc kubenswrapper[4762]: I0217 17:54:35.502824 4762 scope.go:117] "RemoveContainer" containerID="b286973f4d1fa1a309d5ad6bc017e7b8e0c71bb803ea6cae1c8a39fca371aeba" Feb 17 17:56:29 crc kubenswrapper[4762]: I0217 17:56:29.232015 4762 scope.go:117] "RemoveContainer" containerID="4f339e29a8012b009ecc4488734d51cbd8dd75c370e8921e0185c93679ffae6a" Feb 17 17:56:29 crc kubenswrapper[4762]: I0217 
17:56:29.255742 4762 scope.go:117] "RemoveContainer" containerID="bd04ac27ea97d97b3ea9a7964120f18f23be95322afc27b8e14d0d1c3f73977e" Feb 17 17:56:29 crc kubenswrapper[4762]: I0217 17:56:29.298657 4762 scope.go:117] "RemoveContainer" containerID="ad519fee8a6ce38d7504d01ac7cccd506064da17ceebec5d294ce9e0d1b98172" Feb 17 17:57:04 crc kubenswrapper[4762]: I0217 17:57:04.558045 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:57:04 crc kubenswrapper[4762]: I0217 17:57:04.558798 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:57:29 crc kubenswrapper[4762]: I0217 17:57:29.330192 4762 scope.go:117] "RemoveContainer" containerID="090d7a1886c00978a221f82900cdc10774825743f5c2f80e6e946d59b359601b" Feb 17 17:57:29 crc kubenswrapper[4762]: I0217 17:57:29.350404 4762 scope.go:117] "RemoveContainer" containerID="b039f6f5b600dcf452e209a62b9643b19075606739c353d1f38d5f39d4da7aa3" Feb 17 17:57:29 crc kubenswrapper[4762]: I0217 17:57:29.370264 4762 scope.go:117] "RemoveContainer" containerID="13f373d199d9720fb011274c1a7c147fc8c17a8ac8f46db551f34a22f49cee72" Feb 17 17:57:29 crc kubenswrapper[4762]: I0217 17:57:29.386177 4762 scope.go:117] "RemoveContainer" containerID="4a17eb1bbde5204fb8d646d0ccbb260444c437077729dcb94f7fb63dcf808e70" Feb 17 17:57:34 crc kubenswrapper[4762]: I0217 17:57:34.558798 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:57:34 crc kubenswrapper[4762]: I0217 17:57:34.559470 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:58:04 crc kubenswrapper[4762]: I0217 17:58:04.558516 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:58:04 crc kubenswrapper[4762]: I0217 17:58:04.559139 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:58:04 crc kubenswrapper[4762]: I0217 17:58:04.559233 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 17:58:04 crc kubenswrapper[4762]: I0217 17:58:04.559994 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2385971b9fcd24d1f36cb6487bdb22e9d18c3d925f8b573f1c69c4a33a447969"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:58:04 crc kubenswrapper[4762]: I0217 17:58:04.560056 4762 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" containerID="cri-o://2385971b9fcd24d1f36cb6487bdb22e9d18c3d925f8b573f1c69c4a33a447969" gracePeriod=600 Feb 17 17:58:05 crc kubenswrapper[4762]: I0217 17:58:05.621056 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="2385971b9fcd24d1f36cb6487bdb22e9d18c3d925f8b573f1c69c4a33a447969" exitCode=0 Feb 17 17:58:05 crc kubenswrapper[4762]: I0217 17:58:05.621157 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"2385971b9fcd24d1f36cb6487bdb22e9d18c3d925f8b573f1c69c4a33a447969"} Feb 17 17:58:05 crc kubenswrapper[4762]: I0217 17:58:05.621682 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"53eac13c8290dd1b353e345a7552ad443b04bbc8218394f015dea59e9defb212"} Feb 17 17:58:05 crc kubenswrapper[4762]: I0217 17:58:05.621707 4762 scope.go:117] "RemoveContainer" containerID="e4fa34e3eae7dd4023f4a8dcdfb848ad377b3ac4763f97bb9696cc12d23a4871" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.117278 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f6zrt"] Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.118397 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-controller" containerID="cri-o://26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" gracePeriod=30 Feb 17 17:58:32 crc 
kubenswrapper[4762]: I0217 17:58:32.118834 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="sbdb" containerID="cri-o://3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.118881 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="nbdb" containerID="cri-o://383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.118926 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="northd" containerID="cri-o://9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.118965 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.119017 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-node" containerID="cri-o://4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.119081 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" 
podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-acl-logging" containerID="cri-o://50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.206301 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" containerID="cri-o://230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" gracePeriod=30 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.511968 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/3.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.514762 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovn-acl-logging/0.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.515279 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovn-controller/0.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.515717 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.570734 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8fb2n"] Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.570950 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-acl-logging" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.570962 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-acl-logging" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.570972 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="nbdb" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.570980 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="nbdb" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.570988 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.570993 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571003 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-node" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571009 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-node" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571022 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" 
containerName="sbdb" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571030 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="sbdb" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571037 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571043 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571050 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571057 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571066 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="northd" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571072 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="northd" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571078 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571083 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571090 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" 
containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571095 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571101 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571107 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571116 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kubecfg-setup" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571122 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kubecfg-setup" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571129 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15469884-f0fd-4460-97dd-6a428a3e7e0d" containerName="registry" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571135 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="15469884-f0fd-4460-97dd-6a428a3e7e0d" containerName="registry" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571223 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571233 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="15469884-f0fd-4460-97dd-6a428a3e7e0d" containerName="registry" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571242 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" 
containerName="ovn-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571250 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="sbdb" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571259 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571266 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-node" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571275 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571283 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571292 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="nbdb" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571300 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovn-acl-logging" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571307 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="northd" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.571401 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571408 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571505 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.571714 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerName="ovnkube-controller" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.573353 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695597 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-netd\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695692 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-kubelet\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695718 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-slash\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695736 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-netns\") pod 
\"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695779 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-env-overrides\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695768 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695803 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-config\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695883 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-ovn-kubernetes\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695924 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-ovn\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 
17:58:32.695932 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695966 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e901c69-4b38-4f54-9811-83bd34c46a07-ovn-node-metrics-cert\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695994 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.695976 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-slash" (OuterVolumeSpecName: "host-slash") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696020 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696045 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696011 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696079 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696187 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-bin\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696241 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-var-lib-openvswitch\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696308 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-log-socket\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696351 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-openvswitch\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696348 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696355 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696447 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696383 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696390 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-log-socket" (OuterVolumeSpecName: "log-socket") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696407 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696413 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696415 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-systemd-units\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696568 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-node-log\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696591 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-systemd\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc 
kubenswrapper[4762]: I0217 17:58:32.696669 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-script-lib\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696703 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-etc-openvswitch\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.696762 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwmmf\" (UniqueName: \"kubernetes.io/projected/8e901c69-4b38-4f54-9811-83bd34c46a07-kube-api-access-xwmmf\") pod \"8e901c69-4b38-4f54-9811-83bd34c46a07\" (UID: \"8e901c69-4b38-4f54-9811-83bd34c46a07\") " Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697049 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697108 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-env-overrides\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697133 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovnkube-script-lib\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697186 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovnkube-config\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697232 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-slash\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697278 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-kubelet\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697355 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-etc-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697420 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-systemd\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697457 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697494 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovn-node-metrics-cert\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697526 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-run-netns\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697552 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-cni-bin\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc 
kubenswrapper[4762]: I0217 17:58:32.697656 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-ovn\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697711 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-node-log\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697756 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwfbb\" (UniqueName: \"kubernetes.io/projected/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-kube-api-access-zwfbb\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697827 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697872 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-log-socket\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697913 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-systemd-units\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697943 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-cni-netd\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.697987 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-var-lib-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698055 4762 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698072 4762 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698088 4762 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698101 4762 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698113 4762 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698127 4762 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698140 4762 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698153 4762 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698168 4762 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698180 4762 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 
17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698193 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698209 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698224 4762 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698238 4762 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.698311 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-node-log" (OuterVolumeSpecName: "node-log") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.699084 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.699649 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.702364 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e901c69-4b38-4f54-9811-83bd34c46a07-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.703255 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e901c69-4b38-4f54-9811-83bd34c46a07-kube-api-access-xwmmf" (OuterVolumeSpecName: "kube-api-access-xwmmf") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "kube-api-access-xwmmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.713316 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8e901c69-4b38-4f54-9811-83bd34c46a07" (UID: "8e901c69-4b38-4f54-9811-83bd34c46a07"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.798058 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovnkube-controller/3.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799307 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799380 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovn-node-metrics-cert\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-run-netns\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799409 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-run-netns\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799636 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-cni-bin\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799670 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-ovn\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799697 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-node-log\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwfbb\" (UniqueName: \"kubernetes.io/projected/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-kube-api-access-zwfbb\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799765 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-ovn\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799812 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-log-socket\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799826 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799842 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-systemd-units\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-log-socket\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799869 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-node-log\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799912 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-cni-netd\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-cni-bin\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799864 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-cni-netd\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799970 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-var-lib-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: 
\"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799897 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-systemd-units\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.799994 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800017 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-env-overrides\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovnkube-script-lib\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800060 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800066 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovnkube-config\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800032 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-var-lib-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800142 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-slash\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800165 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-slash\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800185 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-kubelet\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 
17:58:32.800263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-host-kubelet\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800298 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-etc-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-systemd\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-etc-openvswitch\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800412 4762 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-run-systemd\") pod \"ovnkube-node-8fb2n\" (UID: 
\"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800424 4762 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800450 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e901c69-4b38-4f54-9811-83bd34c46a07-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800464 4762 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e901c69-4b38-4f54-9811-83bd34c46a07-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800477 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwmmf\" (UniqueName: \"kubernetes.io/projected/8e901c69-4b38-4f54-9811-83bd34c46a07-kube-api-access-xwmmf\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800489 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e901c69-4b38-4f54-9811-83bd34c46a07-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800841 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-env-overrides\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.800904 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovnkube-script-lib\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.801374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovnkube-config\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.801930 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovn-acl-logging/0.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.802545 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6zrt_8e901c69-4b38-4f54-9811-83bd34c46a07/ovn-controller/0.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803043 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" exitCode=0 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803080 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" exitCode=0 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803091 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" exitCode=0 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803103 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" 
containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" exitCode=0 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803113 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" exitCode=0 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803123 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" exitCode=0 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803131 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" exitCode=143 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803141 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e901c69-4b38-4f54-9811-83bd34c46a07" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" exitCode=143 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803165 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803083 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803421 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803487 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803502 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803532 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803546 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} Feb 17 17:58:32 crc 
kubenswrapper[4762]: I0217 17:58:32.803531 4762 scope.go:117] "RemoveContainer" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803560 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803761 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803769 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803777 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803783 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803788 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803793 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803881 4762 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803890 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803900 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803911 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803918 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803923 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803947 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803952 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} Feb 17 17:58:32 crc 
kubenswrapper[4762]: I0217 17:58:32.803958 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803962 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803968 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803973 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803978 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.803986 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804714 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804759 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804767 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804773 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804783 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804792 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804798 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804805 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804811 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804818 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804836 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6zrt" event={"ID":"8e901c69-4b38-4f54-9811-83bd34c46a07","Type":"ContainerDied","Data":"cd2c6574ad6bea413adcb230281e117866ad87bbee89e734f3d32453093b3cc4"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804730 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-ovn-node-metrics-cert\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804857 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804896 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804906 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804912 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804918 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804922 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804928 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804932 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804937 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.804942 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.806066 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/2.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.806583 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/1.log" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.806616 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0f706d4-18a1-44c0-8913-b46af7876ee7" 
containerID="e19faa18f6cade3c3f82c533ec423e13be43192275899b9259b9cc023d77df2e" exitCode=2 Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.806670 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerDied","Data":"e19faa18f6cade3c3f82c533ec423e13be43192275899b9259b9cc023d77df2e"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.806747 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0"} Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.807597 4762 scope.go:117] "RemoveContainer" containerID="e19faa18f6cade3c3f82c533ec423e13be43192275899b9259b9cc023d77df2e" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.807859 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-k2xfd_openshift-multus(d0f706d4-18a1-44c0-8913-b46af7876ee7)\"" pod="openshift-multus/multus-k2xfd" podUID="d0f706d4-18a1-44c0-8913-b46af7876ee7" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.821451 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwfbb\" (UniqueName: \"kubernetes.io/projected/11a0f4e2-2d9a-4be1-b836-c3243bc9ba81-kube-api-access-zwfbb\") pod \"ovnkube-node-8fb2n\" (UID: \"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.832278 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.854681 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f6zrt"] Feb 17 17:58:32 crc 
kubenswrapper[4762]: I0217 17:58:32.857027 4762 scope.go:117] "RemoveContainer" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.859559 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f6zrt"] Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.872339 4762 scope.go:117] "RemoveContainer" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.888827 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.889865 4762 scope.go:117] "RemoveContainer" containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.908978 4762 scope.go:117] "RemoveContainer" containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.925879 4762 scope.go:117] "RemoveContainer" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.943275 4762 scope.go:117] "RemoveContainer" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.957902 4762 scope.go:117] "RemoveContainer" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.977190 4762 scope.go:117] "RemoveContainer" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.996568 4762 scope.go:117] "RemoveContainer" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 
17:58:32.997278 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": container with ID starting with 230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb not found: ID does not exist" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.997327 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} err="failed to get container status \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": rpc error: code = NotFound desc = could not find container \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": container with ID starting with 230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb not found: ID does not exist" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.997378 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.998547 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": container with ID starting with 57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488 not found: ID does not exist" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.998588 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} err="failed to get container status \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": rpc 
error: code = NotFound desc = could not find container \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": container with ID starting with 57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488 not found: ID does not exist" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.998608 4762 scope.go:117] "RemoveContainer" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.999060 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": container with ID starting with 3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93 not found: ID does not exist" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.999115 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} err="failed to get container status \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": rpc error: code = NotFound desc = could not find container \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": container with ID starting with 3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93 not found: ID does not exist" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.999154 4762 scope.go:117] "RemoveContainer" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" Feb 17 17:58:32 crc kubenswrapper[4762]: E0217 17:58:32.999521 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": container with ID starting with 
383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448 not found: ID does not exist" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.999550 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} err="failed to get container status \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": rpc error: code = NotFound desc = could not find container \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": container with ID starting with 383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448 not found: ID does not exist" Feb 17 17:58:32 crc kubenswrapper[4762]: I0217 17:58:32.999567 4762 scope.go:117] "RemoveContainer" containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" Feb 17 17:58:33 crc kubenswrapper[4762]: E0217 17:58:32.999941 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": container with ID starting with 9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50 not found: ID does not exist" containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:32.999987 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} err="failed to get container status \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": rpc error: code = NotFound desc = could not find container \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": container with ID starting with 9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50 not found: ID does not 
exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.000004 4762 scope.go:117] "RemoveContainer" containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" Feb 17 17:58:33 crc kubenswrapper[4762]: E0217 17:58:33.000511 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": container with ID starting with 3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64 not found: ID does not exist" containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.000586 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} err="failed to get container status \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": rpc error: code = NotFound desc = could not find container \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": container with ID starting with 3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.000653 4762 scope.go:117] "RemoveContainer" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" Feb 17 17:58:33 crc kubenswrapper[4762]: E0217 17:58:33.001158 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": container with ID starting with 4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e not found: ID does not exist" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.001188 4762 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} err="failed to get container status \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": rpc error: code = NotFound desc = could not find container \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": container with ID starting with 4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.001211 4762 scope.go:117] "RemoveContainer" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" Feb 17 17:58:33 crc kubenswrapper[4762]: E0217 17:58:33.001568 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": container with ID starting with 50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c not found: ID does not exist" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.001662 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} err="failed to get container status \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": rpc error: code = NotFound desc = could not find container \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": container with ID starting with 50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.001703 4762 scope.go:117] "RemoveContainer" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" Feb 17 17:58:33 crc kubenswrapper[4762]: E0217 17:58:33.002055 4762 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": container with ID starting with 26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc not found: ID does not exist" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.002081 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} err="failed to get container status \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": rpc error: code = NotFound desc = could not find container \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": container with ID starting with 26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.002100 4762 scope.go:117] "RemoveContainer" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" Feb 17 17:58:33 crc kubenswrapper[4762]: E0217 17:58:33.002487 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": container with ID starting with 2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53 not found: ID does not exist" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.002525 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} err="failed to get container status \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": rpc error: code = NotFound desc = could 
not find container \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": container with ID starting with 2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.002552 4762 scope.go:117] "RemoveContainer" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.002894 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} err="failed to get container status \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": rpc error: code = NotFound desc = could not find container \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": container with ID starting with 230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.002917 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.003350 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} err="failed to get container status \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": rpc error: code = NotFound desc = could not find container \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": container with ID starting with 57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.003402 4762 scope.go:117] "RemoveContainer" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 
17:58:33.003809 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} err="failed to get container status \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": rpc error: code = NotFound desc = could not find container \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": container with ID starting with 3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.003838 4762 scope.go:117] "RemoveContainer" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.004152 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} err="failed to get container status \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": rpc error: code = NotFound desc = could not find container \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": container with ID starting with 383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.004174 4762 scope.go:117] "RemoveContainer" containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.004612 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} err="failed to get container status \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": rpc error: code = NotFound desc = could not find container \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": container with ID starting with 
9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.004660 4762 scope.go:117] "RemoveContainer" containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.005162 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} err="failed to get container status \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": rpc error: code = NotFound desc = could not find container \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": container with ID starting with 3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.005204 4762 scope.go:117] "RemoveContainer" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.005554 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} err="failed to get container status \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": rpc error: code = NotFound desc = could not find container \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": container with ID starting with 4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.005588 4762 scope.go:117] "RemoveContainer" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.006198 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} err="failed to get container status \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": rpc error: code = NotFound desc = could not find container \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": container with ID starting with 50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.006237 4762 scope.go:117] "RemoveContainer" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.007770 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} err="failed to get container status \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": rpc error: code = NotFound desc = could not find container \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": container with ID starting with 26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.007823 4762 scope.go:117] "RemoveContainer" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.008693 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} err="failed to get container status \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": rpc error: code = NotFound desc = could not find container \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": container with ID starting with 2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53 not found: ID does not 
exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.008722 4762 scope.go:117] "RemoveContainer" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.009097 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} err="failed to get container status \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": rpc error: code = NotFound desc = could not find container \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": container with ID starting with 230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.009156 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.009436 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} err="failed to get container status \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": rpc error: code = NotFound desc = could not find container \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": container with ID starting with 57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.009462 4762 scope.go:117] "RemoveContainer" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.009838 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} err="failed to get container status 
\"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": rpc error: code = NotFound desc = could not find container \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": container with ID starting with 3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.009865 4762 scope.go:117] "RemoveContainer" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.010258 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} err="failed to get container status \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": rpc error: code = NotFound desc = could not find container \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": container with ID starting with 383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.010326 4762 scope.go:117] "RemoveContainer" containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.010617 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} err="failed to get container status \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": rpc error: code = NotFound desc = could not find container \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": container with ID starting with 9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.010655 4762 scope.go:117] "RemoveContainer" 
containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.010951 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} err="failed to get container status \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": rpc error: code = NotFound desc = could not find container \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": container with ID starting with 3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.010971 4762 scope.go:117] "RemoveContainer" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.011325 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} err="failed to get container status \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": rpc error: code = NotFound desc = could not find container \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": container with ID starting with 4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.011348 4762 scope.go:117] "RemoveContainer" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.011655 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} err="failed to get container status \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": rpc error: code = NotFound desc = could 
not find container \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": container with ID starting with 50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.011677 4762 scope.go:117] "RemoveContainer" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.011968 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} err="failed to get container status \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": rpc error: code = NotFound desc = could not find container \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": container with ID starting with 26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.011993 4762 scope.go:117] "RemoveContainer" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.012250 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} err="failed to get container status \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": rpc error: code = NotFound desc = could not find container \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": container with ID starting with 2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.012274 4762 scope.go:117] "RemoveContainer" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 
17:58:33.012586 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} err="failed to get container status \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": rpc error: code = NotFound desc = could not find container \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": container with ID starting with 230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.012617 4762 scope.go:117] "RemoveContainer" containerID="57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.013109 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488"} err="failed to get container status \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": rpc error: code = NotFound desc = could not find container \"57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488\": container with ID starting with 57a82d96c069f853bdf210d3709fb4e5641b913dc4d9adcf33d88d6a98916488 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.013127 4762 scope.go:117] "RemoveContainer" containerID="3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.013443 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93"} err="failed to get container status \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": rpc error: code = NotFound desc = could not find container \"3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93\": container with ID starting with 
3ce954f709fa0e2ab5b28b850693e662bba4af80e3ae3a4cd91725f325065e93 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.013462 4762 scope.go:117] "RemoveContainer" containerID="383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.013818 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448"} err="failed to get container status \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": rpc error: code = NotFound desc = could not find container \"383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448\": container with ID starting with 383be387cfdc9b3c60f7c2ebd558e5866145fd9a56ec92413b5a718cbd852448 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.013866 4762 scope.go:117] "RemoveContainer" containerID="9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.014206 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50"} err="failed to get container status \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": rpc error: code = NotFound desc = could not find container \"9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50\": container with ID starting with 9496c047db7d4d98480f558eef7654e8c5559444749f144d3e340b3b404f4b50 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.014229 4762 scope.go:117] "RemoveContainer" containerID="3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.014549 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64"} err="failed to get container status \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": rpc error: code = NotFound desc = could not find container \"3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64\": container with ID starting with 3abf1d3b39c2a7ee1ab8e8fce2a7147fe1a3ea1acf0b1c0fbe86ebde2d33ad64 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.014572 4762 scope.go:117] "RemoveContainer" containerID="4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.018033 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e"} err="failed to get container status \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": rpc error: code = NotFound desc = could not find container \"4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e\": container with ID starting with 4d5189e60ac89c094b76cfec6c827ce15cfd4276bc783c8f4c67e443a8bb709e not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.018063 4762 scope.go:117] "RemoveContainer" containerID="50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.018497 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c"} err="failed to get container status \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": rpc error: code = NotFound desc = could not find container \"50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c\": container with ID starting with 50ff0563c699fe1e6438cc7c567c5203e2e3b6b1f91f5e9a09f05dcfacc7340c not found: ID does not 
exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.018533 4762 scope.go:117] "RemoveContainer" containerID="26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.018815 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc"} err="failed to get container status \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": rpc error: code = NotFound desc = could not find container \"26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc\": container with ID starting with 26c149d1e43c97cd03e9975c820c9269d6bb077fd9c7592ca6d15ef65a3718cc not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.018838 4762 scope.go:117] "RemoveContainer" containerID="2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.019242 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53"} err="failed to get container status \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": rpc error: code = NotFound desc = could not find container \"2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53\": container with ID starting with 2a70a65e0875123440a3907908e9b716fdb809d815bde4792231ff7d5398ce53 not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.019278 4762 scope.go:117] "RemoveContainer" containerID="230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.019570 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb"} err="failed to get container status 
\"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": rpc error: code = NotFound desc = could not find container \"230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb\": container with ID starting with 230c3dbe2a4577a5006c2256bdacfa28852f5a58bb89028422f908e03292a7cb not found: ID does not exist" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.043312 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e901c69-4b38-4f54-9811-83bd34c46a07" path="/var/lib/kubelet/pods/8e901c69-4b38-4f54-9811-83bd34c46a07/volumes" Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.814653 4762 generic.go:334] "Generic (PLEG): container finished" podID="11a0f4e2-2d9a-4be1-b836-c3243bc9ba81" containerID="a745ec817d03bf3fc419becb642bf6ee489ee1578b473c082aaa450e34d177e9" exitCode=0 Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.814718 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerDied","Data":"a745ec817d03bf3fc419becb642bf6ee489ee1578b473c082aaa450e34d177e9"} Feb 17 17:58:33 crc kubenswrapper[4762]: I0217 17:58:33.815070 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"30567c8f419d784b8c9cde06abb07a082a42da6cb3171a6d31ff97f1583e9128"} Feb 17 17:58:34 crc kubenswrapper[4762]: I0217 17:58:34.828852 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"70c4050dc2331283fae8f3567794aad30b40ad470daefcd52f55c75641e6b6b7"} Feb 17 17:58:34 crc kubenswrapper[4762]: I0217 17:58:34.829475 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" 
event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"9eda5bd60aa58ce54b7c414d7ec100a2d60b251a9bfd680890b3fa87e64a19f2"} Feb 17 17:58:34 crc kubenswrapper[4762]: I0217 17:58:34.829524 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"d47033e99f8bc1ef61eac95ffd29c00e09008e5e7674dc44fa6feaa6ef5b7c0e"} Feb 17 17:58:34 crc kubenswrapper[4762]: I0217 17:58:34.829537 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"fed156c6f33e850a7eace4c06f0ce80f8571c4ffaaa9ebc85d6c34398d360ca8"} Feb 17 17:58:34 crc kubenswrapper[4762]: I0217 17:58:34.829547 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"d2b1e25b355eb0990c18cd083626f1fc2ddf0217d490d4e43906911c47295338"} Feb 17 17:58:34 crc kubenswrapper[4762]: I0217 17:58:34.829557 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"5fec45bd9998ce136c9b9d178b179e7ed2fe06893b08d536e2796542248bc82d"} Feb 17 17:58:36 crc kubenswrapper[4762]: I0217 17:58:36.842308 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"a1ea672dcc1e21ca08b698a6785710455cadf9a5f8304a7701faa32ecb3e74c1"} Feb 17 17:58:39 crc kubenswrapper[4762]: I0217 17:58:39.866738 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" 
event={"ID":"11a0f4e2-2d9a-4be1-b836-c3243bc9ba81","Type":"ContainerStarted","Data":"622880311d1cafc5527135627a10e0d14dd13ff412d8af64731cc46bc5f4bb17"} Feb 17 17:58:39 crc kubenswrapper[4762]: I0217 17:58:39.867358 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:39 crc kubenswrapper[4762]: I0217 17:58:39.867376 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:39 crc kubenswrapper[4762]: I0217 17:58:39.895020 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:39 crc kubenswrapper[4762]: I0217 17:58:39.895656 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" podStartSLOduration=7.895609623 podStartE2EDuration="7.895609623s" podCreationTimestamp="2026-02-17 17:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:58:39.894230154 +0000 UTC m=+671.539148174" watchObservedRunningTime="2026-02-17 17:58:39.895609623 +0000 UTC m=+671.540527633" Feb 17 17:58:40 crc kubenswrapper[4762]: I0217 17:58:40.874545 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:40 crc kubenswrapper[4762]: I0217 17:58:40.901864 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:58:47 crc kubenswrapper[4762]: I0217 17:58:47.035735 4762 scope.go:117] "RemoveContainer" containerID="e19faa18f6cade3c3f82c533ec423e13be43192275899b9259b9cc023d77df2e" Feb 17 17:58:47 crc kubenswrapper[4762]: E0217 17:58:47.036189 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-k2xfd_openshift-multus(d0f706d4-18a1-44c0-8913-b46af7876ee7)\"" pod="openshift-multus/multus-k2xfd" podUID="d0f706d4-18a1-44c0-8913-b46af7876ee7" Feb 17 17:59:00 crc kubenswrapper[4762]: I0217 17:59:00.035883 4762 scope.go:117] "RemoveContainer" containerID="e19faa18f6cade3c3f82c533ec423e13be43192275899b9259b9cc023d77df2e" Feb 17 17:59:00 crc kubenswrapper[4762]: I0217 17:59:00.976960 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/2.log" Feb 17 17:59:00 crc kubenswrapper[4762]: I0217 17:59:00.977617 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/1.log" Feb 17 17:59:00 crc kubenswrapper[4762]: I0217 17:59:00.977687 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k2xfd" event={"ID":"d0f706d4-18a1-44c0-8913-b46af7876ee7","Type":"ContainerStarted","Data":"fc21d22cb2b3b9502c0773cb86914e50c3f6feb633cac79797ea48af51b1136f"} Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.469068 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd"] Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.470650 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.472465 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.474536 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd"] Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.620486 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72nkh\" (UniqueName: \"kubernetes.io/projected/d89f05a2-322d-448a-91a0-c193c28943a1-kube-api-access-72nkh\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.620565 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.620655 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: 
I0217 17:59:02.721980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72nkh\" (UniqueName: \"kubernetes.io/projected/d89f05a2-322d-448a-91a0-c193c28943a1-kube-api-access-72nkh\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.722050 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.722073 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.722814 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.722813 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.744028 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72nkh\" (UniqueName: \"kubernetes.io/projected/d89f05a2-322d-448a-91a0-c193c28943a1-kube-api-access-72nkh\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.801050 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.909902 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fb2n" Feb 17 17:59:02 crc kubenswrapper[4762]: I0217 17:59:02.991989 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd"] Feb 17 17:59:04 crc kubenswrapper[4762]: I0217 17:59:04.000453 4762 generic.go:334] "Generic (PLEG): container finished" podID="d89f05a2-322d-448a-91a0-c193c28943a1" containerID="56486858f0c5758561a4ec429659cd2a9e8afefa7976ea9262fe874edd6033c9" exitCode=0 Feb 17 17:59:04 crc kubenswrapper[4762]: I0217 17:59:04.000509 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" 
event={"ID":"d89f05a2-322d-448a-91a0-c193c28943a1","Type":"ContainerDied","Data":"56486858f0c5758561a4ec429659cd2a9e8afefa7976ea9262fe874edd6033c9"} Feb 17 17:59:04 crc kubenswrapper[4762]: I0217 17:59:04.000540 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" event={"ID":"d89f05a2-322d-448a-91a0-c193c28943a1","Type":"ContainerStarted","Data":"2db2a9a88f0a444d655c373ec38757212820df589f86931fbb9773a0e06e97f5"} Feb 17 17:59:04 crc kubenswrapper[4762]: I0217 17:59:04.002960 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:59:06 crc kubenswrapper[4762]: I0217 17:59:06.011075 4762 generic.go:334] "Generic (PLEG): container finished" podID="d89f05a2-322d-448a-91a0-c193c28943a1" containerID="5bc0d196b730b687287250052fb8c86af340183ef68e86d3eb8d3a089ac48c81" exitCode=0 Feb 17 17:59:06 crc kubenswrapper[4762]: I0217 17:59:06.011160 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" event={"ID":"d89f05a2-322d-448a-91a0-c193c28943a1","Type":"ContainerDied","Data":"5bc0d196b730b687287250052fb8c86af340183ef68e86d3eb8d3a089ac48c81"} Feb 17 17:59:07 crc kubenswrapper[4762]: I0217 17:59:07.020495 4762 generic.go:334] "Generic (PLEG): container finished" podID="d89f05a2-322d-448a-91a0-c193c28943a1" containerID="537cf82e97504048ef4b799ec1f4db2cc01e35551b9a4d87169b0a0a60eb17e8" exitCode=0 Feb 17 17:59:07 crc kubenswrapper[4762]: I0217 17:59:07.020579 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" event={"ID":"d89f05a2-322d-448a-91a0-c193c28943a1","Type":"ContainerDied","Data":"537cf82e97504048ef4b799ec1f4db2cc01e35551b9a4d87169b0a0a60eb17e8"} Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.287683 4762 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.388914 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72nkh\" (UniqueName: \"kubernetes.io/projected/d89f05a2-322d-448a-91a0-c193c28943a1-kube-api-access-72nkh\") pod \"d89f05a2-322d-448a-91a0-c193c28943a1\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.389013 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-bundle\") pod \"d89f05a2-322d-448a-91a0-c193c28943a1\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.389053 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-util\") pod \"d89f05a2-322d-448a-91a0-c193c28943a1\" (UID: \"d89f05a2-322d-448a-91a0-c193c28943a1\") " Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.390210 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-bundle" (OuterVolumeSpecName: "bundle") pod "d89f05a2-322d-448a-91a0-c193c28943a1" (UID: "d89f05a2-322d-448a-91a0-c193c28943a1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.395345 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89f05a2-322d-448a-91a0-c193c28943a1-kube-api-access-72nkh" (OuterVolumeSpecName: "kube-api-access-72nkh") pod "d89f05a2-322d-448a-91a0-c193c28943a1" (UID: "d89f05a2-322d-448a-91a0-c193c28943a1"). 
InnerVolumeSpecName "kube-api-access-72nkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.489745 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.489775 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72nkh\" (UniqueName: \"kubernetes.io/projected/d89f05a2-322d-448a-91a0-c193c28943a1-kube-api-access-72nkh\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.617067 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-util" (OuterVolumeSpecName: "util") pod "d89f05a2-322d-448a-91a0-c193c28943a1" (UID: "d89f05a2-322d-448a-91a0-c193c28943a1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:08 crc kubenswrapper[4762]: I0217 17:59:08.692664 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d89f05a2-322d-448a-91a0-c193c28943a1-util\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:09 crc kubenswrapper[4762]: I0217 17:59:09.032045 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" event={"ID":"d89f05a2-322d-448a-91a0-c193c28943a1","Type":"ContainerDied","Data":"2db2a9a88f0a444d655c373ec38757212820df589f86931fbb9773a0e06e97f5"} Feb 17 17:59:09 crc kubenswrapper[4762]: I0217 17:59:09.032091 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db2a9a88f0a444d655c373ec38757212820df589f86931fbb9773a0e06e97f5" Feb 17 17:59:09 crc kubenswrapper[4762]: I0217 17:59:09.032172 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.290383 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74"] Feb 17 17:59:18 crc kubenswrapper[4762]: E0217 17:59:18.291106 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="pull" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.291117 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="pull" Feb 17 17:59:18 crc kubenswrapper[4762]: E0217 17:59:18.291130 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="extract" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.291136 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="extract" Feb 17 17:59:18 crc kubenswrapper[4762]: E0217 17:59:18.291149 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="util" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.291155 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="util" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.291241 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89f05a2-322d-448a-91a0-c193c28943a1" containerName="extract" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.291637 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.293469 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.293612 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.294064 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.296523 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hrgcq" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.304897 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.313231 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74"] Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.388571 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea3ffdb1-8694-4cc4-90df-653c25a14fac-webhook-cert\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.388640 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea3ffdb1-8694-4cc4-90df-653c25a14fac-apiservice-cert\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: 
\"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.388819 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc7hl\" (UniqueName: \"kubernetes.io/projected/ea3ffdb1-8694-4cc4-90df-653c25a14fac-kube-api-access-mc7hl\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.490423 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea3ffdb1-8694-4cc4-90df-653c25a14fac-webhook-cert\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.490494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea3ffdb1-8694-4cc4-90df-653c25a14fac-apiservice-cert\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.490549 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc7hl\" (UniqueName: \"kubernetes.io/projected/ea3ffdb1-8694-4cc4-90df-653c25a14fac-kube-api-access-mc7hl\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.496595 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea3ffdb1-8694-4cc4-90df-653c25a14fac-webhook-cert\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.509384 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea3ffdb1-8694-4cc4-90df-653c25a14fac-apiservice-cert\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.528440 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc7hl\" (UniqueName: \"kubernetes.io/projected/ea3ffdb1-8694-4cc4-90df-653c25a14fac-kube-api-access-mc7hl\") pod \"metallb-operator-controller-manager-796c5cd795-qwv74\" (UID: \"ea3ffdb1-8694-4cc4-90df-653c25a14fac\") " pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.608260 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.679118 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh"] Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.679787 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.681708 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.681816 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.682039 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-m6qmt" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.690099 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh"] Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.793262 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/adbe61a0-9505-4f77-9775-fc8559ae1231-apiservice-cert\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.793312 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/adbe61a0-9505-4f77-9775-fc8559ae1231-webhook-cert\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.793350 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dblvj\" (UniqueName: 
\"kubernetes.io/projected/adbe61a0-9505-4f77-9775-fc8559ae1231-kube-api-access-dblvj\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.882131 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74"] Feb 17 17:59:18 crc kubenswrapper[4762]: W0217 17:59:18.889407 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3ffdb1_8694_4cc4_90df_653c25a14fac.slice/crio-89d3297b47c0a1b2332b9b817609946e4c65e02bc21f93f8a05c67def39a28cf WatchSource:0}: Error finding container 89d3297b47c0a1b2332b9b817609946e4c65e02bc21f93f8a05c67def39a28cf: Status 404 returned error can't find the container with id 89d3297b47c0a1b2332b9b817609946e4c65e02bc21f93f8a05c67def39a28cf Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.894183 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/adbe61a0-9505-4f77-9775-fc8559ae1231-apiservice-cert\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.894510 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/adbe61a0-9505-4f77-9775-fc8559ae1231-webhook-cert\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.894541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-dblvj\" (UniqueName: \"kubernetes.io/projected/adbe61a0-9505-4f77-9775-fc8559ae1231-kube-api-access-dblvj\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.897444 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/adbe61a0-9505-4f77-9775-fc8559ae1231-webhook-cert\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.897478 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/adbe61a0-9505-4f77-9775-fc8559ae1231-apiservice-cert\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.910714 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dblvj\" (UniqueName: \"kubernetes.io/projected/adbe61a0-9505-4f77-9775-fc8559ae1231-kube-api-access-dblvj\") pod \"metallb-operator-webhook-server-85df54ff8f-pfcdh\" (UID: \"adbe61a0-9505-4f77-9775-fc8559ae1231\") " pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:18 crc kubenswrapper[4762]: I0217 17:59:18.995411 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:19 crc kubenswrapper[4762]: I0217 17:59:19.225958 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh"] Feb 17 17:59:19 crc kubenswrapper[4762]: I0217 17:59:19.377521 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" event={"ID":"ea3ffdb1-8694-4cc4-90df-653c25a14fac","Type":"ContainerStarted","Data":"89d3297b47c0a1b2332b9b817609946e4c65e02bc21f93f8a05c67def39a28cf"} Feb 17 17:59:19 crc kubenswrapper[4762]: I0217 17:59:19.378654 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" event={"ID":"adbe61a0-9505-4f77-9775-fc8559ae1231","Type":"ContainerStarted","Data":"18368461edefe2e85945e75c9c28d883249dbaf4e8bb49b6a796428ea03d6b68"} Feb 17 17:59:23 crc kubenswrapper[4762]: I0217 17:59:23.406248 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" event={"ID":"ea3ffdb1-8694-4cc4-90df-653c25a14fac","Type":"ContainerStarted","Data":"04bd5f1dada99b4519dc80f949da6a06dc3a7f5adb4d47cdf8c16878cf9e3ce9"} Feb 17 17:59:23 crc kubenswrapper[4762]: I0217 17:59:23.407275 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:23 crc kubenswrapper[4762]: I0217 17:59:23.408248 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" event={"ID":"adbe61a0-9505-4f77-9775-fc8559ae1231","Type":"ContainerStarted","Data":"8527b8a0094cfd4387f18006e49ecf353328a811a55d82f7e491dcf660c33c5b"} Feb 17 17:59:23 crc kubenswrapper[4762]: I0217 17:59:23.408393 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:23 crc kubenswrapper[4762]: I0217 17:59:23.428689 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" podStartSLOduration=1.3444986939999999 podStartE2EDuration="5.428669405s" podCreationTimestamp="2026-02-17 17:59:18 +0000 UTC" firstStartedPulling="2026-02-17 17:59:18.89102431 +0000 UTC m=+710.535942320" lastFinishedPulling="2026-02-17 17:59:22.975195021 +0000 UTC m=+714.620113031" observedRunningTime="2026-02-17 17:59:23.425197165 +0000 UTC m=+715.070115175" watchObservedRunningTime="2026-02-17 17:59:23.428669405 +0000 UTC m=+715.073587415" Feb 17 17:59:23 crc kubenswrapper[4762]: I0217 17:59:23.451462 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" podStartSLOduration=1.642916753 podStartE2EDuration="5.451439167s" podCreationTimestamp="2026-02-17 17:59:18 +0000 UTC" firstStartedPulling="2026-02-17 17:59:19.232857524 +0000 UTC m=+710.877775534" lastFinishedPulling="2026-02-17 17:59:23.041379938 +0000 UTC m=+714.686297948" observedRunningTime="2026-02-17 17:59:23.445061594 +0000 UTC m=+715.089979604" watchObservedRunningTime="2026-02-17 17:59:23.451439167 +0000 UTC m=+715.096357177" Feb 17 17:59:29 crc kubenswrapper[4762]: I0217 17:59:29.439671 4762 scope.go:117] "RemoveContainer" containerID="88fc815166fa6c0085645fe21f8f72b46f7def0e905a83810ae8be5eafc28fe0" Feb 17 17:59:30 crc kubenswrapper[4762]: I0217 17:59:30.451274 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k2xfd_d0f706d4-18a1-44c0-8913-b46af7876ee7/kube-multus/2.log" Feb 17 17:59:38 crc kubenswrapper[4762]: I0217 17:59:38.999363 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-85df54ff8f-pfcdh" Feb 17 17:59:58 crc 
kubenswrapper[4762]: I0217 17:59:58.611044 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-796c5cd795-qwv74" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.255120 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fb2tl"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.257279 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.259884 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.260179 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7plhd" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.260685 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.270092 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.270741 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 17:59:59 crc kubenswrapper[4762]: W0217 17:59:59.272851 4762 reflector.go:561] object-"metallb-system"/"frr-k8s-webhook-server-cert": failed to list *v1.Secret: secrets "frr-k8s-webhook-server-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.272892 4762 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"frr-k8s-webhook-server-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"frr-k8s-webhook-server-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.289927 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.350923 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-conf\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.350976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-startup\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351003 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbs6l\" (UniqueName: \"kubernetes.io/projected/b25f9642-b43c-436a-821d-383a0912cd63-kube-api-access-fbs6l\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: \"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351036 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-reloader\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351066 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351143 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-sockets\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351165 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b25f9642-b43c-436a-821d-383a0912cd63-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: \"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351253 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cwdl\" (UniqueName: \"kubernetes.io/projected/6b32c016-322c-462b-b41d-c880ce8bd1ac-kube-api-access-2cwdl\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.351291 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics-certs\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.352942 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mdv5x"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.353915 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.356064 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-s2nbd" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.356338 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.356540 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.357399 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.389701 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-n248r"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.390494 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.398273 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.422009 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-n248r"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452771 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7bp\" (UniqueName: \"kubernetes.io/projected/feadf162-5dc5-42c5-9c7e-b36a1659213b-kube-api-access-8t7bp\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452816 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-sockets\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452838 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b25f9642-b43c-436a-821d-383a0912cd63-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: \"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452856 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cwdl\" (UniqueName: \"kubernetes.io/projected/6b32c016-322c-462b-b41d-c880ce8bd1ac-kube-api-access-2cwdl\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452876 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbjv\" (UniqueName: \"kubernetes.io/projected/242cfeca-c170-4125-8784-ffdf74df96d5-kube-api-access-hdbjv\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452895 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics-certs\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452910 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-conf\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452932 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-startup\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452954 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbs6l\" (UniqueName: \"kubernetes.io/projected/b25f9642-b43c-436a-821d-383a0912cd63-kube-api-access-fbs6l\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: \"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452973 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/242cfeca-c170-4125-8784-ffdf74df96d5-metrics-certs\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.452989 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/242cfeca-c170-4125-8784-ffdf74df96d5-cert\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.453010 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/feadf162-5dc5-42c5-9c7e-b36a1659213b-metallb-excludel2\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.453027 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-metrics-certs\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.453050 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-reloader\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.453516 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-reloader\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.453817 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.453874 4762 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.453953 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics-certs podName:6b32c016-322c-462b-b41d-c880ce8bd1ac nodeName:}" failed. No retries permitted until 2026-02-17 17:59:59.953931527 +0000 UTC m=+751.598849607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics-certs") pod "frr-k8s-fb2tl" (UID: "6b32c016-322c-462b-b41d-c880ce8bd1ac") : secret "frr-k8s-certs-secret" not found Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.454298 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-sockets\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.454418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-conf\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.454956 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6b32c016-322c-462b-b41d-c880ce8bd1ac-frr-startup\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.478329 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cwdl\" (UniqueName: \"kubernetes.io/projected/6b32c016-322c-462b-b41d-c880ce8bd1ac-kube-api-access-2cwdl\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.481345 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbs6l\" (UniqueName: \"kubernetes.io/projected/b25f9642-b43c-436a-821d-383a0912cd63-kube-api-access-fbs6l\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: 
\"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.553945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.554244 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t7bp\" (UniqueName: \"kubernetes.io/projected/feadf162-5dc5-42c5-9c7e-b36a1659213b-kube-api-access-8t7bp\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.554148 4762 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.554917 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist podName:feadf162-5dc5-42c5-9c7e-b36a1659213b nodeName:}" failed. No retries permitted until 2026-02-17 18:00:00.05490014 +0000 UTC m=+751.699818140 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist") pod "speaker-mdv5x" (UID: "feadf162-5dc5-42c5-9c7e-b36a1659213b") : secret "metallb-memberlist" not found Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.554722 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbjv\" (UniqueName: \"kubernetes.io/projected/242cfeca-c170-4125-8784-ffdf74df96d5-kube-api-access-hdbjv\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.555236 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/242cfeca-c170-4125-8784-ffdf74df96d5-metrics-certs\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.555765 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/242cfeca-c170-4125-8784-ffdf74df96d5-cert\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.555865 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/feadf162-5dc5-42c5-9c7e-b36a1659213b-metallb-excludel2\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.555964 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-metrics-certs\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.556132 4762 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 17 17:59:59 crc kubenswrapper[4762]: E0217 17:59:59.556241 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-metrics-certs podName:feadf162-5dc5-42c5-9c7e-b36a1659213b nodeName:}" failed. No retries permitted until 2026-02-17 18:00:00.056230438 +0000 UTC m=+751.701148448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-metrics-certs") pod "speaker-mdv5x" (UID: "feadf162-5dc5-42c5-9c7e-b36a1659213b") : secret "speaker-certs-secret" not found Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.556791 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/feadf162-5dc5-42c5-9c7e-b36a1659213b-metallb-excludel2\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.559962 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.560367 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/242cfeca-c170-4125-8784-ffdf74df96d5-metrics-certs\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.571039 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/242cfeca-c170-4125-8784-ffdf74df96d5-cert\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.576212 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t7bp\" (UniqueName: \"kubernetes.io/projected/feadf162-5dc5-42c5-9c7e-b36a1659213b-kube-api-access-8t7bp\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.585112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbjv\" (UniqueName: \"kubernetes.io/projected/242cfeca-c170-4125-8784-ffdf74df96d5-kube-api-access-hdbjv\") pod \"controller-69bbfbf88f-n248r\" (UID: \"242cfeca-c170-4125-8784-ffdf74df96d5\") " pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.705172 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.884827 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-n248r"] Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.960037 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics-certs\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 17:59:59 crc kubenswrapper[4762]: I0217 17:59:59.965397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b32c016-322c-462b-b41d-c880ce8bd1ac-metrics-certs\") pod \"frr-k8s-fb2tl\" (UID: \"6b32c016-322c-462b-b41d-c880ce8bd1ac\") " pod="metallb-system/frr-k8s-fb2tl" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.061159 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.061265 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-metrics-certs\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 18:00:00 crc kubenswrapper[4762]: E0217 18:00:00.061277 4762 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 18:00:00 crc kubenswrapper[4762]: E0217 18:00:00.061334 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist podName:feadf162-5dc5-42c5-9c7e-b36a1659213b nodeName:}" failed. No retries permitted until 2026-02-17 18:00:01.06131631 +0000 UTC m=+752.706234320 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist") pod "speaker-mdv5x" (UID: "feadf162-5dc5-42c5-9c7e-b36a1659213b") : secret "metallb-memberlist" not found Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.065803 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-metrics-certs\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.139637 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk"] Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.140334 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.142453 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.142547 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.150350 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk"] Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.162019 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e72c717d-61ac-4cf4-9b43-864a772f6b78-config-volume\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.162070 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5m92\" (UniqueName: \"kubernetes.io/projected/e72c717d-61ac-4cf4-9b43-864a772f6b78-kube-api-access-z5m92\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.162132 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e72c717d-61ac-4cf4-9b43-864a772f6b78-secret-volume\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.177038 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fb2tl" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.263004 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e72c717d-61ac-4cf4-9b43-864a772f6b78-secret-volume\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.263075 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e72c717d-61ac-4cf4-9b43-864a772f6b78-config-volume\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.263115 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5m92\" (UniqueName: \"kubernetes.io/projected/e72c717d-61ac-4cf4-9b43-864a772f6b78-kube-api-access-z5m92\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.264051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e72c717d-61ac-4cf4-9b43-864a772f6b78-config-volume\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.268383 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e72c717d-61ac-4cf4-9b43-864a772f6b78-secret-volume\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.277608 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5m92\" (UniqueName: \"kubernetes.io/projected/e72c717d-61ac-4cf4-9b43-864a772f6b78-kube-api-access-z5m92\") pod \"collect-profiles-29522520-8w5dk\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: E0217 18:00:00.455195 4762 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 18:00:00 crc kubenswrapper[4762]: E0217 18:00:00.455576 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b25f9642-b43c-436a-821d-383a0912cd63-cert podName:b25f9642-b43c-436a-821d-383a0912cd63 nodeName:}" failed. No retries permitted until 2026-02-17 18:00:00.955555535 +0000 UTC m=+752.600473555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b25f9642-b43c-436a-821d-383a0912cd63-cert") pod "frr-k8s-webhook-server-78b44bf5bb-v84sn" (UID: "b25f9642-b43c-436a-821d-383a0912cd63") : failed to sync secret cache: timed out waiting for the condition Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.542659 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.673321 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"1bd406a4700cef8e6d46ea2c36b11e8ca35db12c917134ec71bf95af701966f2"} Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.675188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-n248r" event={"ID":"242cfeca-c170-4125-8784-ffdf74df96d5","Type":"ContainerStarted","Data":"09ce3673effa8743e98c6103a0fde26c0c1465733696e33fcebeb37902611eaf"} Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.675233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-n248r" event={"ID":"242cfeca-c170-4125-8784-ffdf74df96d5","Type":"ContainerStarted","Data":"ee47ab6e57ec46726e3f92b189a2bc621611fda0ed5f919220d9b6694ad37238"} Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.755192 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk"] Feb 17 18:00:00 crc kubenswrapper[4762]: W0217 18:00:00.763879 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode72c717d_61ac_4cf4_9b43_864a772f6b78.slice/crio-ce605bd7584ba5e62e0db3fa645ffa1528f221f54a03d83c13a18359059f55ad WatchSource:0}: Error finding container ce605bd7584ba5e62e0db3fa645ffa1528f221f54a03d83c13a18359059f55ad: Status 404 returned error can't find the container with id ce605bd7584ba5e62e0db3fa645ffa1528f221f54a03d83c13a18359059f55ad Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.793386 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 
18:00:00.974416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b25f9642-b43c-436a-821d-383a0912cd63-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: \"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 18:00:00 crc kubenswrapper[4762]: I0217 18:00:00.978974 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b25f9642-b43c-436a-821d-383a0912cd63-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-v84sn\" (UID: \"b25f9642-b43c-436a-821d-383a0912cd63\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.075663 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.078787 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/feadf162-5dc5-42c5-9c7e-b36a1659213b-memberlist\") pod \"speaker-mdv5x\" (UID: \"feadf162-5dc5-42c5-9c7e-b36a1659213b\") " pod="metallb-system/speaker-mdv5x" Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.087583 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.170739 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mdv5x" Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.330359 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn"] Feb 17 18:00:01 crc kubenswrapper[4762]: W0217 18:00:01.344771 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb25f9642_b43c_436a_821d_383a0912cd63.slice/crio-3bd9d57e369b889dc8bf6ca429f468e33a44ea2d1cab1d46da1ec528a4e73c42 WatchSource:0}: Error finding container 3bd9d57e369b889dc8bf6ca429f468e33a44ea2d1cab1d46da1ec528a4e73c42: Status 404 returned error can't find the container with id 3bd9d57e369b889dc8bf6ca429f468e33a44ea2d1cab1d46da1ec528a4e73c42 Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.681385 4762 generic.go:334] "Generic (PLEG): container finished" podID="e72c717d-61ac-4cf4-9b43-864a772f6b78" containerID="3a03b6f572d479205f797dc6b902d959ba98fd5d19c8a06ef8516ea93d681a1b" exitCode=0 Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.681497 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" event={"ID":"e72c717d-61ac-4cf4-9b43-864a772f6b78","Type":"ContainerDied","Data":"3a03b6f572d479205f797dc6b902d959ba98fd5d19c8a06ef8516ea93d681a1b"} Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.681793 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" event={"ID":"e72c717d-61ac-4cf4-9b43-864a772f6b78","Type":"ContainerStarted","Data":"ce605bd7584ba5e62e0db3fa645ffa1528f221f54a03d83c13a18359059f55ad"} Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.682923 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" 
event={"ID":"b25f9642-b43c-436a-821d-383a0912cd63","Type":"ContainerStarted","Data":"3bd9d57e369b889dc8bf6ca429f468e33a44ea2d1cab1d46da1ec528a4e73c42"} Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.684340 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mdv5x" event={"ID":"feadf162-5dc5-42c5-9c7e-b36a1659213b","Type":"ContainerStarted","Data":"8463ae2f2a2105a2902d3b060a8af1872a07bd6b323b94b162037a0f9073775c"} Feb 17 18:00:01 crc kubenswrapper[4762]: I0217 18:00:01.684380 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mdv5x" event={"ID":"feadf162-5dc5-42c5-9c7e-b36a1659213b","Type":"ContainerStarted","Data":"660fb93692095a90d7671e9c7c323b2dde704f7fd29fa29935d1b9fc12a4cb94"} Feb 17 18:00:02 crc kubenswrapper[4762]: I0217 18:00:02.959300 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.097091 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e72c717d-61ac-4cf4-9b43-864a772f6b78-secret-volume\") pod \"e72c717d-61ac-4cf4-9b43-864a772f6b78\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.097183 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e72c717d-61ac-4cf4-9b43-864a772f6b78-config-volume\") pod \"e72c717d-61ac-4cf4-9b43-864a772f6b78\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.097268 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5m92\" (UniqueName: \"kubernetes.io/projected/e72c717d-61ac-4cf4-9b43-864a772f6b78-kube-api-access-z5m92\") pod 
\"e72c717d-61ac-4cf4-9b43-864a772f6b78\" (UID: \"e72c717d-61ac-4cf4-9b43-864a772f6b78\") " Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.097931 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e72c717d-61ac-4cf4-9b43-864a772f6b78-config-volume" (OuterVolumeSpecName: "config-volume") pod "e72c717d-61ac-4cf4-9b43-864a772f6b78" (UID: "e72c717d-61ac-4cf4-9b43-864a772f6b78"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.107824 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72c717d-61ac-4cf4-9b43-864a772f6b78-kube-api-access-z5m92" (OuterVolumeSpecName: "kube-api-access-z5m92") pod "e72c717d-61ac-4cf4-9b43-864a772f6b78" (UID: "e72c717d-61ac-4cf4-9b43-864a772f6b78"). InnerVolumeSpecName "kube-api-access-z5m92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.110079 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e72c717d-61ac-4cf4-9b43-864a772f6b78-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e72c717d-61ac-4cf4-9b43-864a772f6b78" (UID: "e72c717d-61ac-4cf4-9b43-864a772f6b78"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.198778 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e72c717d-61ac-4cf4-9b43-864a772f6b78-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.198834 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e72c717d-61ac-4cf4-9b43-864a772f6b78-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.198847 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5m92\" (UniqueName: \"kubernetes.io/projected/e72c717d-61ac-4cf4-9b43-864a772f6b78-kube-api-access-z5m92\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.694134 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" event={"ID":"e72c717d-61ac-4cf4-9b43-864a772f6b78","Type":"ContainerDied","Data":"ce605bd7584ba5e62e0db3fa645ffa1528f221f54a03d83c13a18359059f55ad"} Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.694170 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce605bd7584ba5e62e0db3fa645ffa1528f221f54a03d83c13a18359059f55ad" Feb 17 18:00:03 crc kubenswrapper[4762]: I0217 18:00:03.694184 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-8w5dk" Feb 17 18:00:04 crc kubenswrapper[4762]: I0217 18:00:04.015436 4762 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 18:00:04 crc kubenswrapper[4762]: I0217 18:00:04.558202 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:00:04 crc kubenswrapper[4762]: I0217 18:00:04.558551 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:00:16 crc kubenswrapper[4762]: I0217 18:00:16.764577 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-n248r" event={"ID":"242cfeca-c170-4125-8784-ffdf74df96d5","Type":"ContainerStarted","Data":"ccb98ac2cf4d1485cce945fb0342112ee8e2806909a6a07a4a5161267829e140"} Feb 17 18:00:16 crc kubenswrapper[4762]: I0217 18:00:16.765026 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 18:00:16 crc kubenswrapper[4762]: I0217 18:00:16.769443 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-n248r" Feb 17 18:00:16 crc kubenswrapper[4762]: I0217 18:00:16.788706 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-n248r" podStartSLOduration=1.37121555 podStartE2EDuration="17.788688189s" 
podCreationTimestamp="2026-02-17 17:59:59 +0000 UTC" firstStartedPulling="2026-02-17 18:00:00.037141937 +0000 UTC m=+751.682059947" lastFinishedPulling="2026-02-17 18:00:16.454614576 +0000 UTC m=+768.099532586" observedRunningTime="2026-02-17 18:00:16.785394224 +0000 UTC m=+768.430312234" watchObservedRunningTime="2026-02-17 18:00:16.788688189 +0000 UTC m=+768.433606199" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.107179 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzwzq"] Feb 17 18:00:18 crc kubenswrapper[4762]: E0217 18:00:18.112324 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e72c717d-61ac-4cf4-9b43-864a772f6b78" containerName="collect-profiles" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.112422 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e72c717d-61ac-4cf4-9b43-864a772f6b78" containerName="collect-profiles" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.112717 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e72c717d-61ac-4cf4-9b43-864a772f6b78" containerName="collect-profiles" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.114064 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.116817 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzwzq"] Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.289038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-utilities\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.289090 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8hs\" (UniqueName: \"kubernetes.io/projected/531338de-084d-4f71-a1c1-fe97a92f9bd3-kube-api-access-fh8hs\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.289132 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-catalog-content\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.389884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-catalog-content\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.389967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-utilities\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.389993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8hs\" (UniqueName: \"kubernetes.io/projected/531338de-084d-4f71-a1c1-fe97a92f9bd3-kube-api-access-fh8hs\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.390381 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-catalog-content\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.390639 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-utilities\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.418996 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8hs\" (UniqueName: \"kubernetes.io/projected/531338de-084d-4f71-a1c1-fe97a92f9bd3-kube-api-access-fh8hs\") pod \"redhat-operators-rzwzq\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:18 crc kubenswrapper[4762]: I0217 18:00:18.476033 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:19 crc kubenswrapper[4762]: I0217 18:00:19.992669 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzwzq"] Feb 17 18:00:19 crc kubenswrapper[4762]: W0217 18:00:19.997413 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531338de_084d_4f71_a1c1_fe97a92f9bd3.slice/crio-54e88f5d4f2c31e42e869d41c7c5c86e75a505d32cd2699f0e1fffc1612feb42 WatchSource:0}: Error finding container 54e88f5d4f2c31e42e869d41c7c5c86e75a505d32cd2699f0e1fffc1612feb42: Status 404 returned error can't find the container with id 54e88f5d4f2c31e42e869d41c7c5c86e75a505d32cd2699f0e1fffc1612feb42 Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.796454 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mdv5x" event={"ID":"feadf162-5dc5-42c5-9c7e-b36a1659213b","Type":"ContainerStarted","Data":"a8e37e9ee8570399ed42b466c9f391659e1970f6526dda198714fafd57f89bf9"} Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.796881 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mdv5x" Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.800744 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" event={"ID":"b25f9642-b43c-436a-821d-383a0912cd63","Type":"ContainerStarted","Data":"b62db369b5c9da6b11b2029104c2e8f25641bb0cdc81ed8c41df83f9619c342b"} Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.801476 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.801738 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mdv5x" Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 
18:00:20.805144 4762 generic.go:334] "Generic (PLEG): container finished" podID="6b32c016-322c-462b-b41d-c880ce8bd1ac" containerID="9ce9cb964b3414e2a3acf8e043dc3747b58af3f0637325e00b40da7583e8c391" exitCode=0 Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.805268 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerDied","Data":"9ce9cb964b3414e2a3acf8e043dc3747b58af3f0637325e00b40da7583e8c391"} Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.809425 4762 generic.go:334] "Generic (PLEG): container finished" podID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerID="25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6" exitCode=0 Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.809471 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerDied","Data":"25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6"} Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.809511 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerStarted","Data":"54e88f5d4f2c31e42e869d41c7c5c86e75a505d32cd2699f0e1fffc1612feb42"} Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.821588 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mdv5x" podStartSLOduration=3.649667743 podStartE2EDuration="21.82154271s" podCreationTimestamp="2026-02-17 17:59:59 +0000 UTC" firstStartedPulling="2026-02-17 18:00:01.463164386 +0000 UTC m=+753.108082386" lastFinishedPulling="2026-02-17 18:00:19.635039333 +0000 UTC m=+771.279957353" observedRunningTime="2026-02-17 18:00:20.818231365 +0000 UTC m=+772.463149375" watchObservedRunningTime="2026-02-17 18:00:20.82154271 +0000 UTC 
m=+772.466460740" Feb 17 18:00:20 crc kubenswrapper[4762]: I0217 18:00:20.858174 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" podStartSLOduration=3.540169245 podStartE2EDuration="21.858154489s" podCreationTimestamp="2026-02-17 17:59:59 +0000 UTC" firstStartedPulling="2026-02-17 18:00:01.348583593 +0000 UTC m=+752.993501603" lastFinishedPulling="2026-02-17 18:00:19.666568837 +0000 UTC m=+771.311486847" observedRunningTime="2026-02-17 18:00:20.857938933 +0000 UTC m=+772.502856983" watchObservedRunningTime="2026-02-17 18:00:20.858154489 +0000 UTC m=+772.503072499" Feb 17 18:00:21 crc kubenswrapper[4762]: I0217 18:00:21.830933 4762 generic.go:334] "Generic (PLEG): container finished" podID="6b32c016-322c-462b-b41d-c880ce8bd1ac" containerID="ace55315884836c7482087444b45a28479fdd84f3e16bb61faacca2445f1afc9" exitCode=0 Feb 17 18:00:21 crc kubenswrapper[4762]: I0217 18:00:21.831340 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerDied","Data":"ace55315884836c7482087444b45a28479fdd84f3e16bb61faacca2445f1afc9"} Feb 17 18:00:21 crc kubenswrapper[4762]: I0217 18:00:21.834735 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerStarted","Data":"39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a"} Feb 17 18:00:22 crc kubenswrapper[4762]: I0217 18:00:22.844152 4762 generic.go:334] "Generic (PLEG): container finished" podID="6b32c016-322c-462b-b41d-c880ce8bd1ac" containerID="80787876c5f0cfd397f3f91772654ff28bc4f336a1936e1e11708950f9ced9a3" exitCode=0 Feb 17 18:00:22 crc kubenswrapper[4762]: I0217 18:00:22.844238 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" 
event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerDied","Data":"80787876c5f0cfd397f3f91772654ff28bc4f336a1936e1e11708950f9ced9a3"} Feb 17 18:00:22 crc kubenswrapper[4762]: I0217 18:00:22.847217 4762 generic.go:334] "Generic (PLEG): container finished" podID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerID="39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a" exitCode=0 Feb 17 18:00:22 crc kubenswrapper[4762]: I0217 18:00:22.847267 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerDied","Data":"39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.856986 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"9f5317933941fdb976c0b7ef37a63d150b3605fcc7cbeda7a18d00d30ac375b0"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.858729 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fb2tl" Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.858751 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"624ee60d6998989c5524f097080e2ae2e416f1913a0c79baf3bbfc55a233977b"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.858767 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"2f2fb3c432cf68415fb63f1645dea70e242a3a62a8cfdd539761a6c73f9ddeb2"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.858777 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" 
event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"8ef2603a428c008eb102c1b1cb30807f558c17187f1cb767642e1fbce659f208"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.858822 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"e42f510ee4509f8c60c1039ca76f39fd5791804fb3006dcbe9e3f351cdc9a73a"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.858831 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fb2tl" event={"ID":"6b32c016-322c-462b-b41d-c880ce8bd1ac","Type":"ContainerStarted","Data":"8b818de19e63c114e146197870ad6229362c5be370a9483fc4180c2746955b55"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.860409 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerStarted","Data":"aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c"} Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.884800 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fb2tl" podStartSLOduration=5.109688735 podStartE2EDuration="24.884782239s" podCreationTimestamp="2026-02-17 17:59:59 +0000 UTC" firstStartedPulling="2026-02-17 18:00:00.29902345 +0000 UTC m=+751.943941460" lastFinishedPulling="2026-02-17 18:00:20.074116954 +0000 UTC m=+771.719034964" observedRunningTime="2026-02-17 18:00:23.88063959 +0000 UTC m=+775.525557610" watchObservedRunningTime="2026-02-17 18:00:23.884782239 +0000 UTC m=+775.529700249" Feb 17 18:00:23 crc kubenswrapper[4762]: I0217 18:00:23.899644 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzwzq" podStartSLOduration=3.427594946 podStartE2EDuration="5.899615124s" podCreationTimestamp="2026-02-17 18:00:18 +0000 
UTC" firstStartedPulling="2026-02-17 18:00:20.811550334 +0000 UTC m=+772.456468344" lastFinishedPulling="2026-02-17 18:00:23.283570512 +0000 UTC m=+774.928488522" observedRunningTime="2026-02-17 18:00:23.896904566 +0000 UTC m=+775.541822596" watchObservedRunningTime="2026-02-17 18:00:23.899615124 +0000 UTC m=+775.544533134" Feb 17 18:00:25 crc kubenswrapper[4762]: I0217 18:00:25.178261 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fb2tl" Feb 17 18:00:25 crc kubenswrapper[4762]: I0217 18:00:25.216335 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fb2tl" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.302505 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-swr77"] Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.304242 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.308571 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.310451 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-h974k" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.313040 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-swr77"] Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.314956 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.500561 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwnr\" (UniqueName: 
\"kubernetes.io/projected/a6ac3385-2ac0-4ecd-9372-d3f944832571-kube-api-access-clwnr\") pod \"mariadb-operator-index-swr77\" (UID: \"a6ac3385-2ac0-4ecd-9372-d3f944832571\") " pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.601560 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwnr\" (UniqueName: \"kubernetes.io/projected/a6ac3385-2ac0-4ecd-9372-d3f944832571-kube-api-access-clwnr\") pod \"mariadb-operator-index-swr77\" (UID: \"a6ac3385-2ac0-4ecd-9372-d3f944832571\") " pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.619107 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwnr\" (UniqueName: \"kubernetes.io/projected/a6ac3385-2ac0-4ecd-9372-d3f944832571-kube-api-access-clwnr\") pod \"mariadb-operator-index-swr77\" (UID: \"a6ac3385-2ac0-4ecd-9372-d3f944832571\") " pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.623947 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:26 crc kubenswrapper[4762]: I0217 18:00:26.950104 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-swr77"] Feb 17 18:00:26 crc kubenswrapper[4762]: W0217 18:00:26.959763 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ac3385_2ac0_4ecd_9372_d3f944832571.slice/crio-4b210dd288cf79e8424ac4a19014f7fa3216e40a82c7702682cafa5793ceff5b WatchSource:0}: Error finding container 4b210dd288cf79e8424ac4a19014f7fa3216e40a82c7702682cafa5793ceff5b: Status 404 returned error can't find the container with id 4b210dd288cf79e8424ac4a19014f7fa3216e40a82c7702682cafa5793ceff5b Feb 17 18:00:27 crc kubenswrapper[4762]: I0217 18:00:27.885092 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-swr77" event={"ID":"a6ac3385-2ac0-4ecd-9372-d3f944832571","Type":"ContainerStarted","Data":"4b210dd288cf79e8424ac4a19014f7fa3216e40a82c7702682cafa5793ceff5b"} Feb 17 18:00:28 crc kubenswrapper[4762]: I0217 18:00:28.477008 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:28 crc kubenswrapper[4762]: I0217 18:00:28.477287 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:28 crc kubenswrapper[4762]: I0217 18:00:28.891777 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-swr77" event={"ID":"a6ac3385-2ac0-4ecd-9372-d3f944832571","Type":"ContainerStarted","Data":"64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4"} Feb 17 18:00:28 crc kubenswrapper[4762]: I0217 18:00:28.907517 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-index-swr77" podStartSLOduration=1.74664763 podStartE2EDuration="2.907498047s" podCreationTimestamp="2026-02-17 18:00:26 +0000 UTC" firstStartedPulling="2026-02-17 18:00:26.961653458 +0000 UTC m=+778.606571468" lastFinishedPulling="2026-02-17 18:00:28.122503865 +0000 UTC m=+779.767421885" observedRunningTime="2026-02-17 18:00:28.904529307 +0000 UTC m=+780.549447347" watchObservedRunningTime="2026-02-17 18:00:28.907498047 +0000 UTC m=+780.552416057" Feb 17 18:00:29 crc kubenswrapper[4762]: I0217 18:00:29.521283 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzwzq" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="registry-server" probeResult="failure" output=< Feb 17 18:00:29 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 18:00:29 crc kubenswrapper[4762]: > Feb 17 18:00:29 crc kubenswrapper[4762]: I0217 18:00:29.685998 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-swr77"] Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.297055 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-q298f"] Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.297860 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.305385 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-q298f"] Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.446975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dmhp\" (UniqueName: \"kubernetes.io/projected/fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693-kube-api-access-6dmhp\") pod \"mariadb-operator-index-q298f\" (UID: \"fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693\") " pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.548711 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dmhp\" (UniqueName: \"kubernetes.io/projected/fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693-kube-api-access-6dmhp\") pod \"mariadb-operator-index-q298f\" (UID: \"fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693\") " pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.567672 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dmhp\" (UniqueName: \"kubernetes.io/projected/fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693-kube-api-access-6dmhp\") pod \"mariadb-operator-index-q298f\" (UID: \"fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693\") " pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.617843 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.899524 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-swr77" podUID="a6ac3385-2ac0-4ecd-9372-d3f944832571" containerName="registry-server" containerID="cri-o://64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4" gracePeriod=2 Feb 17 18:00:30 crc kubenswrapper[4762]: I0217 18:00:30.997166 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-q298f"] Feb 17 18:00:31 crc kubenswrapper[4762]: W0217 18:00:31.005676 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9ddf77_1b5c_4e67_8c36_f1b8ce9d9693.slice/crio-0b8cf12e7817fb277e050e0bdd246b4de8aa05f2d58369fe82527e90fda0626b WatchSource:0}: Error finding container 0b8cf12e7817fb277e050e0bdd246b4de8aa05f2d58369fe82527e90fda0626b: Status 404 returned error can't find the container with id 0b8cf12e7817fb277e050e0bdd246b4de8aa05f2d58369fe82527e90fda0626b Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.092133 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-v84sn" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.226685 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.365441 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clwnr\" (UniqueName: \"kubernetes.io/projected/a6ac3385-2ac0-4ecd-9372-d3f944832571-kube-api-access-clwnr\") pod \"a6ac3385-2ac0-4ecd-9372-d3f944832571\" (UID: \"a6ac3385-2ac0-4ecd-9372-d3f944832571\") " Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.372108 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ac3385-2ac0-4ecd-9372-d3f944832571-kube-api-access-clwnr" (OuterVolumeSpecName: "kube-api-access-clwnr") pod "a6ac3385-2ac0-4ecd-9372-d3f944832571" (UID: "a6ac3385-2ac0-4ecd-9372-d3f944832571"). InnerVolumeSpecName "kube-api-access-clwnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.467057 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clwnr\" (UniqueName: \"kubernetes.io/projected/a6ac3385-2ac0-4ecd-9372-d3f944832571-kube-api-access-clwnr\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.907801 4762 generic.go:334] "Generic (PLEG): container finished" podID="a6ac3385-2ac0-4ecd-9372-d3f944832571" containerID="64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4" exitCode=0 Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.907873 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-swr77" event={"ID":"a6ac3385-2ac0-4ecd-9372-d3f944832571","Type":"ContainerDied","Data":"64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4"} Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.908340 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-swr77" 
event={"ID":"a6ac3385-2ac0-4ecd-9372-d3f944832571","Type":"ContainerDied","Data":"4b210dd288cf79e8424ac4a19014f7fa3216e40a82c7702682cafa5793ceff5b"} Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.908368 4762 scope.go:117] "RemoveContainer" containerID="64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.907893 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-swr77" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.913081 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-q298f" event={"ID":"fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693","Type":"ContainerStarted","Data":"0b8cf12e7817fb277e050e0bdd246b4de8aa05f2d58369fe82527e90fda0626b"} Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.932188 4762 scope.go:117] "RemoveContainer" containerID="64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4" Feb 17 18:00:31 crc kubenswrapper[4762]: E0217 18:00:31.933889 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4\": container with ID starting with 64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4 not found: ID does not exist" containerID="64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.937838 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4"} err="failed to get container status \"64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4\": rpc error: code = NotFound desc = could not find container \"64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4\": container with ID starting with 
64d3f0084c8a6d5c50fe3882979187ffb12fcd456d3cf54fb582098c83ffced4 not found: ID does not exist" Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.944056 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-swr77"] Feb 17 18:00:31 crc kubenswrapper[4762]: I0217 18:00:31.949735 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-swr77"] Feb 17 18:00:32 crc kubenswrapper[4762]: I0217 18:00:32.920892 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-q298f" event={"ID":"fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693","Type":"ContainerStarted","Data":"f46ff8e52535a1ee14517515f06e08a816209629a9f20647ec86082d724f24e2"} Feb 17 18:00:32 crc kubenswrapper[4762]: I0217 18:00:32.940201 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-q298f" podStartSLOduration=1.390978778 podStartE2EDuration="2.940183358s" podCreationTimestamp="2026-02-17 18:00:30 +0000 UTC" firstStartedPulling="2026-02-17 18:00:31.00981388 +0000 UTC m=+782.654731890" lastFinishedPulling="2026-02-17 18:00:32.55901845 +0000 UTC m=+784.203936470" observedRunningTime="2026-02-17 18:00:32.936039486 +0000 UTC m=+784.580957506" watchObservedRunningTime="2026-02-17 18:00:32.940183358 +0000 UTC m=+784.585101368" Feb 17 18:00:33 crc kubenswrapper[4762]: I0217 18:00:33.044466 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ac3385-2ac0-4ecd-9372-d3f944832571" path="/var/lib/kubelet/pods/a6ac3385-2ac0-4ecd-9372-d3f944832571/volumes" Feb 17 18:00:34 crc kubenswrapper[4762]: I0217 18:00:34.558906 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 17 18:00:34 crc kubenswrapper[4762]: I0217 18:00:34.559295 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:00:38 crc kubenswrapper[4762]: I0217 18:00:38.523440 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:38 crc kubenswrapper[4762]: I0217 18:00:38.564785 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:40 crc kubenswrapper[4762]: I0217 18:00:40.182149 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fb2tl" Feb 17 18:00:40 crc kubenswrapper[4762]: I0217 18:00:40.618054 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:40 crc kubenswrapper[4762]: I0217 18:00:40.618723 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:40 crc kubenswrapper[4762]: I0217 18:00:40.652300 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:40 crc kubenswrapper[4762]: I0217 18:00:40.890054 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzwzq"] Feb 17 18:00:40 crc kubenswrapper[4762]: I0217 18:00:40.890333 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzwzq" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="registry-server" 
containerID="cri-o://aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c" gracePeriod=2 Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.010232 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-q298f" Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.748662 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.910814 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-utilities\") pod \"531338de-084d-4f71-a1c1-fe97a92f9bd3\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.911170 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh8hs\" (UniqueName: \"kubernetes.io/projected/531338de-084d-4f71-a1c1-fe97a92f9bd3-kube-api-access-fh8hs\") pod \"531338de-084d-4f71-a1c1-fe97a92f9bd3\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.911218 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-catalog-content\") pod \"531338de-084d-4f71-a1c1-fe97a92f9bd3\" (UID: \"531338de-084d-4f71-a1c1-fe97a92f9bd3\") " Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.912125 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-utilities" (OuterVolumeSpecName: "utilities") pod "531338de-084d-4f71-a1c1-fe97a92f9bd3" (UID: "531338de-084d-4f71-a1c1-fe97a92f9bd3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.917289 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531338de-084d-4f71-a1c1-fe97a92f9bd3-kube-api-access-fh8hs" (OuterVolumeSpecName: "kube-api-access-fh8hs") pod "531338de-084d-4f71-a1c1-fe97a92f9bd3" (UID: "531338de-084d-4f71-a1c1-fe97a92f9bd3"). InnerVolumeSpecName "kube-api-access-fh8hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.978275 4762 generic.go:334] "Generic (PLEG): container finished" podID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerID="aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c" exitCode=0 Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.978320 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerDied","Data":"aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c"} Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.978343 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzwzq" Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.978376 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzwzq" event={"ID":"531338de-084d-4f71-a1c1-fe97a92f9bd3","Type":"ContainerDied","Data":"54e88f5d4f2c31e42e869d41c7c5c86e75a505d32cd2699f0e1fffc1612feb42"} Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.978396 4762 scope.go:117] "RemoveContainer" containerID="aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c" Feb 17 18:00:41 crc kubenswrapper[4762]: I0217 18:00:41.995270 4762 scope.go:117] "RemoveContainer" containerID="39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.012465 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.012517 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh8hs\" (UniqueName: \"kubernetes.io/projected/531338de-084d-4f71-a1c1-fe97a92f9bd3-kube-api-access-fh8hs\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.021425 4762 scope.go:117] "RemoveContainer" containerID="25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.041851 4762 scope.go:117] "RemoveContainer" containerID="aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c" Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.043272 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c\": container with ID starting with 
aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c not found: ID does not exist" containerID="aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.043311 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c"} err="failed to get container status \"aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c\": rpc error: code = NotFound desc = could not find container \"aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c\": container with ID starting with aeb70c736de078c085ce0241d95dba26af3be0db69531d7b8244b96b7df5c91c not found: ID does not exist" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.043333 4762 scope.go:117] "RemoveContainer" containerID="39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a" Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.043700 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a\": container with ID starting with 39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a not found: ID does not exist" containerID="39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.043739 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a"} err="failed to get container status \"39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a\": rpc error: code = NotFound desc = could not find container \"39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a\": container with ID starting with 39aacef57e36795dcdebd8522f7f346470d8bdda6f35f289bf2a690a4cd9cc6a not found: ID does not 
exist" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.043764 4762 scope.go:117] "RemoveContainer" containerID="25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6" Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.044061 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6\": container with ID starting with 25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6 not found: ID does not exist" containerID="25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.044086 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6"} err="failed to get container status \"25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6\": rpc error: code = NotFound desc = could not find container \"25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6\": container with ID starting with 25959946d09581f0c948bf8deca5ea91b6b8363239cc618e554ee0089baa8cb6 not found: ID does not exist" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.044851 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "531338de-084d-4f71-a1c1-fe97a92f9bd3" (UID: "531338de-084d-4f71-a1c1-fe97a92f9bd3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.114651 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/531338de-084d-4f71-a1c1-fe97a92f9bd3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.306751 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzwzq"] Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.311752 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzwzq"] Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337251 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh"] Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.337502 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="registry-server" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337523 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="registry-server" Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.337543 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ac3385-2ac0-4ecd-9372-d3f944832571" containerName="registry-server" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337552 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ac3385-2ac0-4ecd-9372-d3f944832571" containerName="registry-server" Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.337568 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="extract-utilities" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337576 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="extract-utilities" Feb 17 18:00:42 crc kubenswrapper[4762]: E0217 18:00:42.337588 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="extract-content" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337594 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="extract-content" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337726 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ac3385-2ac0-4ecd-9372-d3f944832571" containerName="registry-server" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.337750 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" containerName="registry-server" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.338664 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.341331 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ph6qt" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.352434 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh"] Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.519726 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-bundle\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc 
kubenswrapper[4762]: I0217 18:00:42.519780 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-util\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.519821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44l8\" (UniqueName: \"kubernetes.io/projected/574b2982-5f13-4465-99b9-19a50dd0efd7-kube-api-access-d44l8\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.621035 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-bundle\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.621092 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-util\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.621133 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d44l8\" (UniqueName: \"kubernetes.io/projected/574b2982-5f13-4465-99b9-19a50dd0efd7-kube-api-access-d44l8\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.621598 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-bundle\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.621860 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-util\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.643325 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44l8\" (UniqueName: \"kubernetes.io/projected/574b2982-5f13-4465-99b9-19a50dd0efd7-kube-api-access-d44l8\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.651221 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.876680 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh"] Feb 17 18:00:42 crc kubenswrapper[4762]: W0217 18:00:42.882245 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574b2982_5f13_4465_99b9_19a50dd0efd7.slice/crio-89dc6f1755b68275d8502d7d8b09a9b4af68b435d85e54bd441a5d12ac77266b WatchSource:0}: Error finding container 89dc6f1755b68275d8502d7d8b09a9b4af68b435d85e54bd441a5d12ac77266b: Status 404 returned error can't find the container with id 89dc6f1755b68275d8502d7d8b09a9b4af68b435d85e54bd441a5d12ac77266b Feb 17 18:00:42 crc kubenswrapper[4762]: I0217 18:00:42.985436 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" event={"ID":"574b2982-5f13-4465-99b9-19a50dd0efd7","Type":"ContainerStarted","Data":"89dc6f1755b68275d8502d7d8b09a9b4af68b435d85e54bd441a5d12ac77266b"} Feb 17 18:00:43 crc kubenswrapper[4762]: I0217 18:00:43.042385 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531338de-084d-4f71-a1c1-fe97a92f9bd3" path="/var/lib/kubelet/pods/531338de-084d-4f71-a1c1-fe97a92f9bd3/volumes" Feb 17 18:00:43 crc kubenswrapper[4762]: I0217 18:00:43.993771 4762 generic.go:334] "Generic (PLEG): container finished" podID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerID="f9c8b15417df76037d9aa8fb9772e08da8c06d194efde435b69e08800a25c132" exitCode=0 Feb 17 18:00:43 crc kubenswrapper[4762]: I0217 18:00:43.993814 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" 
event={"ID":"574b2982-5f13-4465-99b9-19a50dd0efd7","Type":"ContainerDied","Data":"f9c8b15417df76037d9aa8fb9772e08da8c06d194efde435b69e08800a25c132"} Feb 17 18:00:45 crc kubenswrapper[4762]: I0217 18:00:45.000242 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" event={"ID":"574b2982-5f13-4465-99b9-19a50dd0efd7","Type":"ContainerStarted","Data":"7ac4f96683b272419964188cc6d2e74e47b980791af4a29b4124650ac2e7f10e"} Feb 17 18:00:46 crc kubenswrapper[4762]: I0217 18:00:46.012992 4762 generic.go:334] "Generic (PLEG): container finished" podID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerID="7ac4f96683b272419964188cc6d2e74e47b980791af4a29b4124650ac2e7f10e" exitCode=0 Feb 17 18:00:46 crc kubenswrapper[4762]: I0217 18:00:46.013448 4762 generic.go:334] "Generic (PLEG): container finished" podID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerID="7c55b6eba0587f827ee62d6db61a48702670c1a4ee5f4d02977cccf8f361c9fd" exitCode=0 Feb 17 18:00:46 crc kubenswrapper[4762]: I0217 18:00:46.013134 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" event={"ID":"574b2982-5f13-4465-99b9-19a50dd0efd7","Type":"ContainerDied","Data":"7ac4f96683b272419964188cc6d2e74e47b980791af4a29b4124650ac2e7f10e"} Feb 17 18:00:46 crc kubenswrapper[4762]: I0217 18:00:46.013525 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" event={"ID":"574b2982-5f13-4465-99b9-19a50dd0efd7","Type":"ContainerDied","Data":"7c55b6eba0587f827ee62d6db61a48702670c1a4ee5f4d02977cccf8f361c9fd"} Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.413970 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.595567 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-util\") pod \"574b2982-5f13-4465-99b9-19a50dd0efd7\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.595650 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-bundle\") pod \"574b2982-5f13-4465-99b9-19a50dd0efd7\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.595759 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d44l8\" (UniqueName: \"kubernetes.io/projected/574b2982-5f13-4465-99b9-19a50dd0efd7-kube-api-access-d44l8\") pod \"574b2982-5f13-4465-99b9-19a50dd0efd7\" (UID: \"574b2982-5f13-4465-99b9-19a50dd0efd7\") " Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.596854 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-bundle" (OuterVolumeSpecName: "bundle") pod "574b2982-5f13-4465-99b9-19a50dd0efd7" (UID: "574b2982-5f13-4465-99b9-19a50dd0efd7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.600442 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574b2982-5f13-4465-99b9-19a50dd0efd7-kube-api-access-d44l8" (OuterVolumeSpecName: "kube-api-access-d44l8") pod "574b2982-5f13-4465-99b9-19a50dd0efd7" (UID: "574b2982-5f13-4465-99b9-19a50dd0efd7"). InnerVolumeSpecName "kube-api-access-d44l8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.609264 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-util" (OuterVolumeSpecName: "util") pod "574b2982-5f13-4465-99b9-19a50dd0efd7" (UID: "574b2982-5f13-4465-99b9-19a50dd0efd7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.697683 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d44l8\" (UniqueName: \"kubernetes.io/projected/574b2982-5f13-4465-99b9-19a50dd0efd7-kube-api-access-d44l8\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.697739 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:47 crc kubenswrapper[4762]: I0217 18:00:47.697753 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/574b2982-5f13-4465-99b9-19a50dd0efd7-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:48 crc kubenswrapper[4762]: I0217 18:00:48.032151 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" event={"ID":"574b2982-5f13-4465-99b9-19a50dd0efd7","Type":"ContainerDied","Data":"89dc6f1755b68275d8502d7d8b09a9b4af68b435d85e54bd441a5d12ac77266b"} Feb 17 18:00:48 crc kubenswrapper[4762]: I0217 18:00:48.032570 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89dc6f1755b68275d8502d7d8b09a9b4af68b435d85e54bd441a5d12ac77266b" Feb 17 18:00:48 crc kubenswrapper[4762]: I0217 18:00:48.032243 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.498242 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv"] Feb 17 18:00:51 crc kubenswrapper[4762]: E0217 18:00:51.498855 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="pull" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.498869 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="pull" Feb 17 18:00:51 crc kubenswrapper[4762]: E0217 18:00:51.498880 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="extract" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.498886 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="extract" Feb 17 18:00:51 crc kubenswrapper[4762]: E0217 18:00:51.498901 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="util" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.498907 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="util" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.499014 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="574b2982-5f13-4465-99b9-19a50dd0efd7" containerName="extract" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.499387 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: W0217 18:00:51.502021 4762 reflector.go:561] object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "mariadb-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Feb 17 18:00:51 crc kubenswrapper[4762]: E0217 18:00:51.502070 4762 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"mariadb-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"mariadb-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 18:00:51 crc kubenswrapper[4762]: W0217 18:00:51.502921 4762 reflector.go:561] object-"openstack-operators"/"webhook-server-cert": failed to list *v1.Secret: secrets "webhook-server-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Feb 17 18:00:51 crc kubenswrapper[4762]: E0217 18:00:51.502948 4762 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"webhook-server-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"webhook-server-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 18:00:51 crc kubenswrapper[4762]: W0217 18:00:51.503051 4762 reflector.go:561] 
object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-g8xx9": failed to list *v1.Secret: secrets "mariadb-operator-controller-manager-dockercfg-g8xx9" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Feb 17 18:00:51 crc kubenswrapper[4762]: E0217 18:00:51.503144 4762 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"mariadb-operator-controller-manager-dockercfg-g8xx9\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"mariadb-operator-controller-manager-dockercfg-g8xx9\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.521682 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv"] Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.555665 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-apiservice-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.555757 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrzp\" (UniqueName: \"kubernetes.io/projected/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-kube-api-access-4nrzp\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " 
pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.556368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-webhook-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.657440 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-webhook-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.657534 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-apiservice-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.657563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrzp\" (UniqueName: \"kubernetes.io/projected/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-kube-api-access-4nrzp\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:51 crc kubenswrapper[4762]: I0217 18:00:51.680501 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4nrzp\" (UniqueName: \"kubernetes.io/projected/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-kube-api-access-4nrzp\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:52 crc kubenswrapper[4762]: I0217 18:00:52.573115 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-g8xx9" Feb 17 18:00:52 crc kubenswrapper[4762]: E0217 18:00:52.658763 4762 secret.go:188] Couldn't get secret openstack-operators/mariadb-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 18:00:52 crc kubenswrapper[4762]: E0217 18:00:52.658874 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-apiservice-cert podName:2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d nodeName:}" failed. No retries permitted until 2026-02-17 18:00:53.158851226 +0000 UTC m=+804.803769236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-apiservice-cert") pod "mariadb-operator-controller-manager-848b445c8d-6w6cv" (UID: "2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d") : failed to sync secret cache: timed out waiting for the condition Feb 17 18:00:52 crc kubenswrapper[4762]: E0217 18:00:52.658782 4762 secret.go:188] Couldn't get secret openstack-operators/mariadb-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 18:00:52 crc kubenswrapper[4762]: E0217 18:00:52.658965 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-webhook-cert podName:2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d nodeName:}" failed. 
No retries permitted until 2026-02-17 18:00:53.158944788 +0000 UTC m=+804.803862798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-webhook-cert") pod "mariadb-operator-controller-manager-848b445c8d-6w6cv" (UID: "2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d") : failed to sync secret cache: timed out waiting for the condition Feb 17 18:00:52 crc kubenswrapper[4762]: I0217 18:00:52.766173 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Feb 17 18:00:52 crc kubenswrapper[4762]: I0217 18:00:52.782801 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 18:00:53 crc kubenswrapper[4762]: I0217 18:00:53.177656 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-webhook-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:53 crc kubenswrapper[4762]: I0217 18:00:53.177756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-apiservice-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:53 crc kubenswrapper[4762]: I0217 18:00:53.185260 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-webhook-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" 
(UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:53 crc kubenswrapper[4762]: I0217 18:00:53.188689 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d-apiservice-cert\") pod \"mariadb-operator-controller-manager-848b445c8d-6w6cv\" (UID: \"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d\") " pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:53 crc kubenswrapper[4762]: I0217 18:00:53.316084 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:53 crc kubenswrapper[4762]: I0217 18:00:53.541271 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv"] Feb 17 18:00:54 crc kubenswrapper[4762]: I0217 18:00:54.065929 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" event={"ID":"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d","Type":"ContainerStarted","Data":"20d1da555cd092b9b81d37509805357b336f222a5e6ce5976b37e1b50e52b017"} Feb 17 18:00:58 crc kubenswrapper[4762]: I0217 18:00:58.089706 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" event={"ID":"2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d","Type":"ContainerStarted","Data":"932533cd649549eecfcf65a3f27ec6ebe1d2e264b0732aafdd2ed8061096086c"} Feb 17 18:00:58 crc kubenswrapper[4762]: I0217 18:00:58.090288 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:00:58 crc kubenswrapper[4762]: I0217 18:00:58.106517 4762 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" podStartSLOduration=3.672698604 podStartE2EDuration="7.106499949s" podCreationTimestamp="2026-02-17 18:00:51 +0000 UTC" firstStartedPulling="2026-02-17 18:00:53.558426633 +0000 UTC m=+805.203344653" lastFinishedPulling="2026-02-17 18:00:56.992227968 +0000 UTC m=+808.637145998" observedRunningTime="2026-02-17 18:00:58.104685839 +0000 UTC m=+809.749603859" watchObservedRunningTime="2026-02-17 18:00:58.106499949 +0000 UTC m=+809.751417959" Feb 17 18:01:03 crc kubenswrapper[4762]: I0217 18:01:03.322353 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-848b445c8d-6w6cv" Feb 17 18:01:04 crc kubenswrapper[4762]: I0217 18:01:04.558406 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:01:04 crc kubenswrapper[4762]: I0217 18:01:04.558759 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:01:04 crc kubenswrapper[4762]: I0217 18:01:04.558806 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 18:01:04 crc kubenswrapper[4762]: I0217 18:01:04.559366 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53eac13c8290dd1b353e345a7552ad443b04bbc8218394f015dea59e9defb212"} 
pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:01:04 crc kubenswrapper[4762]: I0217 18:01:04.559410 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" containerID="cri-o://53eac13c8290dd1b353e345a7552ad443b04bbc8218394f015dea59e9defb212" gracePeriod=600 Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.125855 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="53eac13c8290dd1b353e345a7552ad443b04bbc8218394f015dea59e9defb212" exitCode=0 Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.125925 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"53eac13c8290dd1b353e345a7552ad443b04bbc8218394f015dea59e9defb212"} Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.126196 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"f6d7169d5319fd48ce328413c1944d85701526c1b8e50744c099c2e1b3abb5de"} Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.126219 4762 scope.go:117] "RemoveContainer" containerID="2385971b9fcd24d1f36cb6487bdb22e9d18c3d925f8b573f1c69c4a33a447969" Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.695531 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-w5gj7"] Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.696216 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.698667 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-b4z2r" Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.712073 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-w5gj7"] Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.740581 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkx7\" (UniqueName: \"kubernetes.io/projected/e476b42e-39ec-4ac9-85c3-b71c41139171-kube-api-access-htkx7\") pod \"infra-operator-index-w5gj7\" (UID: \"e476b42e-39ec-4ac9-85c3-b71c41139171\") " pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.841678 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkx7\" (UniqueName: \"kubernetes.io/projected/e476b42e-39ec-4ac9-85c3-b71c41139171-kube-api-access-htkx7\") pod \"infra-operator-index-w5gj7\" (UID: \"e476b42e-39ec-4ac9-85c3-b71c41139171\") " pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:05 crc kubenswrapper[4762]: I0217 18:01:05.868468 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkx7\" (UniqueName: \"kubernetes.io/projected/e476b42e-39ec-4ac9-85c3-b71c41139171-kube-api-access-htkx7\") pod \"infra-operator-index-w5gj7\" (UID: \"e476b42e-39ec-4ac9-85c3-b71c41139171\") " pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:06 crc kubenswrapper[4762]: I0217 18:01:06.021782 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:06 crc kubenswrapper[4762]: I0217 18:01:06.269718 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-w5gj7"] Feb 17 18:01:06 crc kubenswrapper[4762]: W0217 18:01:06.280442 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode476b42e_39ec_4ac9_85c3_b71c41139171.slice/crio-eae21244160ee523bc28faebd307a6068d5f3be9e183f1fbc3262b065f452588 WatchSource:0}: Error finding container eae21244160ee523bc28faebd307a6068d5f3be9e183f1fbc3262b065f452588: Status 404 returned error can't find the container with id eae21244160ee523bc28faebd307a6068d5f3be9e183f1fbc3262b065f452588 Feb 17 18:01:07 crc kubenswrapper[4762]: I0217 18:01:07.147841 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-w5gj7" event={"ID":"e476b42e-39ec-4ac9-85c3-b71c41139171","Type":"ContainerStarted","Data":"bcf557ccdb46e381d57f132866542ebd2187851aae7ae9915b3891d939e66fab"} Feb 17 18:01:07 crc kubenswrapper[4762]: I0217 18:01:07.148213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-w5gj7" event={"ID":"e476b42e-39ec-4ac9-85c3-b71c41139171","Type":"ContainerStarted","Data":"eae21244160ee523bc28faebd307a6068d5f3be9e183f1fbc3262b065f452588"} Feb 17 18:01:07 crc kubenswrapper[4762]: I0217 18:01:07.168580 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-w5gj7" podStartSLOduration=1.4478627 podStartE2EDuration="2.168544476s" podCreationTimestamp="2026-02-17 18:01:05 +0000 UTC" firstStartedPulling="2026-02-17 18:01:06.282899967 +0000 UTC m=+817.927817977" lastFinishedPulling="2026-02-17 18:01:07.003581743 +0000 UTC m=+818.648499753" observedRunningTime="2026-02-17 18:01:07.164000172 +0000 UTC m=+818.808918202" 
watchObservedRunningTime="2026-02-17 18:01:07.168544476 +0000 UTC m=+818.813462506" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.695611 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6s9g"] Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.697140 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.712787 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6s9g"] Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.784131 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzn9\" (UniqueName: \"kubernetes.io/projected/92bc7402-f73e-4cc5-980d-10c3043b3ba8-kube-api-access-pzzn9\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.784335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-utilities\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.784394 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-catalog-content\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.885885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-pzzn9\" (UniqueName: \"kubernetes.io/projected/92bc7402-f73e-4cc5-980d-10c3043b3ba8-kube-api-access-pzzn9\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.885991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-utilities\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.886013 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-catalog-content\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.886473 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-catalog-content\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.886617 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-utilities\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:15 crc kubenswrapper[4762]: I0217 18:01:15.905850 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzn9\" (UniqueName: 
\"kubernetes.io/projected/92bc7402-f73e-4cc5-980d-10c3043b3ba8-kube-api-access-pzzn9\") pod \"redhat-marketplace-h6s9g\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:16 crc kubenswrapper[4762]: I0217 18:01:16.022228 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:16 crc kubenswrapper[4762]: I0217 18:01:16.022562 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:16 crc kubenswrapper[4762]: I0217 18:01:16.050161 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:16 crc kubenswrapper[4762]: I0217 18:01:16.052780 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:16 crc kubenswrapper[4762]: I0217 18:01:16.243919 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6s9g"] Feb 17 18:01:16 crc kubenswrapper[4762]: I0217 18:01:16.258590 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-w5gj7" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.214853 4762 generic.go:334] "Generic (PLEG): container finished" podID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerID="3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e" exitCode=0 Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.215812 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerDied","Data":"3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e"} Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.215895 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerStarted","Data":"2bc9fa36ad5173351d8a540450e22f0dae1f30e152df3d98b68da3c4ed2e3543"} Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.335516 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56"] Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.337430 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.340293 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ph6qt" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.341729 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56"] Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.407314 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-util\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.407640 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjtfr\" (UniqueName: \"kubernetes.io/projected/12aa14d1-1ff5-4325-8792-d43cfd40cf96-kube-api-access-xjtfr\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " 
pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.407769 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-bundle\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.509370 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-util\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.509814 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjtfr\" (UniqueName: \"kubernetes.io/projected/12aa14d1-1ff5-4325-8792-d43cfd40cf96-kube-api-access-xjtfr\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.509858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-bundle\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 
18:01:17.510163 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-util\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.510292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-bundle\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.528707 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjtfr\" (UniqueName: \"kubernetes.io/projected/12aa14d1-1ff5-4325-8792-d43cfd40cf96-kube-api-access-xjtfr\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:17 crc kubenswrapper[4762]: I0217 18:01:17.656030 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:18 crc kubenswrapper[4762]: I0217 18:01:18.089388 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56"] Feb 17 18:01:18 crc kubenswrapper[4762]: I0217 18:01:18.221692 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" event={"ID":"12aa14d1-1ff5-4325-8792-d43cfd40cf96","Type":"ContainerStarted","Data":"529dcd5d2ee5fc5c6ee9877becd84e4757ca69546bc948c2775c8e520d796d34"} Feb 17 18:01:18 crc kubenswrapper[4762]: I0217 18:01:18.221739 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" event={"ID":"12aa14d1-1ff5-4325-8792-d43cfd40cf96","Type":"ContainerStarted","Data":"286e3614f39401b16dd194a6d5ef8ccf5cde228e64c49b241f79891bb8b67e2e"} Feb 17 18:01:18 crc kubenswrapper[4762]: I0217 18:01:18.225505 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerStarted","Data":"cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467"} Feb 17 18:01:19 crc kubenswrapper[4762]: I0217 18:01:19.232335 4762 generic.go:334] "Generic (PLEG): container finished" podID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerID="cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467" exitCode=0 Feb 17 18:01:19 crc kubenswrapper[4762]: I0217 18:01:19.232539 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerDied","Data":"cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467"} Feb 17 18:01:19 crc kubenswrapper[4762]: I0217 
18:01:19.233321 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerStarted","Data":"d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc"} Feb 17 18:01:19 crc kubenswrapper[4762]: I0217 18:01:19.234808 4762 generic.go:334] "Generic (PLEG): container finished" podID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerID="529dcd5d2ee5fc5c6ee9877becd84e4757ca69546bc948c2775c8e520d796d34" exitCode=0 Feb 17 18:01:19 crc kubenswrapper[4762]: I0217 18:01:19.234835 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" event={"ID":"12aa14d1-1ff5-4325-8792-d43cfd40cf96","Type":"ContainerDied","Data":"529dcd5d2ee5fc5c6ee9877becd84e4757ca69546bc948c2775c8e520d796d34"} Feb 17 18:01:19 crc kubenswrapper[4762]: I0217 18:01:19.267562 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6s9g" podStartSLOduration=2.637173705 podStartE2EDuration="4.267534602s" podCreationTimestamp="2026-02-17 18:01:15 +0000 UTC" firstStartedPulling="2026-02-17 18:01:17.217654615 +0000 UTC m=+828.862572625" lastFinishedPulling="2026-02-17 18:01:18.848015502 +0000 UTC m=+830.492933522" observedRunningTime="2026-02-17 18:01:19.258224449 +0000 UTC m=+830.903142459" watchObservedRunningTime="2026-02-17 18:01:19.267534602 +0000 UTC m=+830.912452632" Feb 17 18:01:21 crc kubenswrapper[4762]: I0217 18:01:21.250602 4762 generic.go:334] "Generic (PLEG): container finished" podID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerID="bd122884d3e43577912af7b5246b4815f1788b91d4e8c1253ee7454d311def4e" exitCode=0 Feb 17 18:01:21 crc kubenswrapper[4762]: I0217 18:01:21.250853 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" 
event={"ID":"12aa14d1-1ff5-4325-8792-d43cfd40cf96","Type":"ContainerDied","Data":"bd122884d3e43577912af7b5246b4815f1788b91d4e8c1253ee7454d311def4e"} Feb 17 18:01:22 crc kubenswrapper[4762]: I0217 18:01:22.260558 4762 generic.go:334] "Generic (PLEG): container finished" podID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerID="7fc481a5478c6429ed957aebe1e54af62199f7e2c6ee0be19d57fcb86524fd37" exitCode=0 Feb 17 18:01:22 crc kubenswrapper[4762]: I0217 18:01:22.260648 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" event={"ID":"12aa14d1-1ff5-4325-8792-d43cfd40cf96","Type":"ContainerDied","Data":"7fc481a5478c6429ed957aebe1e54af62199f7e2c6ee0be19d57fcb86524fd37"} Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.538061 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.589340 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-util\") pod \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.589509 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjtfr\" (UniqueName: \"kubernetes.io/projected/12aa14d1-1ff5-4325-8792-d43cfd40cf96-kube-api-access-xjtfr\") pod \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\" (UID: \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.589540 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-bundle\") pod \"12aa14d1-1ff5-4325-8792-d43cfd40cf96\" (UID: 
\"12aa14d1-1ff5-4325-8792-d43cfd40cf96\") " Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.591780 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-bundle" (OuterVolumeSpecName: "bundle") pod "12aa14d1-1ff5-4325-8792-d43cfd40cf96" (UID: "12aa14d1-1ff5-4325-8792-d43cfd40cf96"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.596132 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12aa14d1-1ff5-4325-8792-d43cfd40cf96-kube-api-access-xjtfr" (OuterVolumeSpecName: "kube-api-access-xjtfr") pod "12aa14d1-1ff5-4325-8792-d43cfd40cf96" (UID: "12aa14d1-1ff5-4325-8792-d43cfd40cf96"). InnerVolumeSpecName "kube-api-access-xjtfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.691422 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjtfr\" (UniqueName: \"kubernetes.io/projected/12aa14d1-1ff5-4325-8792-d43cfd40cf96-kube-api-access-xjtfr\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:23 crc kubenswrapper[4762]: I0217 18:01:23.691462 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:24 crc kubenswrapper[4762]: I0217 18:01:24.274074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" event={"ID":"12aa14d1-1ff5-4325-8792-d43cfd40cf96","Type":"ContainerDied","Data":"286e3614f39401b16dd194a6d5ef8ccf5cde228e64c49b241f79891bb8b67e2e"} Feb 17 18:01:24 crc kubenswrapper[4762]: I0217 18:01:24.274596 4762 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="286e3614f39401b16dd194a6d5ef8ccf5cde228e64c49b241f79891bb8b67e2e" Feb 17 18:01:24 crc kubenswrapper[4762]: I0217 18:01:24.274120 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56" Feb 17 18:01:24 crc kubenswrapper[4762]: I0217 18:01:24.406509 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-util" (OuterVolumeSpecName: "util") pod "12aa14d1-1ff5-4325-8792-d43cfd40cf96" (UID: "12aa14d1-1ff5-4325-8792-d43cfd40cf96"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:01:24 crc kubenswrapper[4762]: I0217 18:01:24.502480 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12aa14d1-1ff5-4325-8792-d43cfd40cf96-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:26 crc kubenswrapper[4762]: I0217 18:01:26.053992 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:26 crc kubenswrapper[4762]: I0217 18:01:26.054495 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:26 crc kubenswrapper[4762]: I0217 18:01:26.097092 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:26 crc kubenswrapper[4762]: I0217 18:01:26.343563 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:27 crc kubenswrapper[4762]: I0217 18:01:27.487147 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6s9g"] Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.308228 4762 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6s9g" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="registry-server" containerID="cri-o://d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc" gracePeriod=2 Feb 17 18:01:29 crc kubenswrapper[4762]: E0217 18:01:29.405492 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92bc7402_f73e_4cc5_980d_10c3043b3ba8.slice/crio-conmon-d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.677884 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.776255 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzzn9\" (UniqueName: \"kubernetes.io/projected/92bc7402-f73e-4cc5-980d-10c3043b3ba8-kube-api-access-pzzn9\") pod \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.777033 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-catalog-content\") pod \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.777084 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-utilities\") pod \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\" (UID: \"92bc7402-f73e-4cc5-980d-10c3043b3ba8\") " Feb 17 
18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.778247 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-utilities" (OuterVolumeSpecName: "utilities") pod "92bc7402-f73e-4cc5-980d-10c3043b3ba8" (UID: "92bc7402-f73e-4cc5-980d-10c3043b3ba8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.783804 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bc7402-f73e-4cc5-980d-10c3043b3ba8-kube-api-access-pzzn9" (OuterVolumeSpecName: "kube-api-access-pzzn9") pod "92bc7402-f73e-4cc5-980d-10c3043b3ba8" (UID: "92bc7402-f73e-4cc5-980d-10c3043b3ba8"). InnerVolumeSpecName "kube-api-access-pzzn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.802928 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92bc7402-f73e-4cc5-980d-10c3043b3ba8" (UID: "92bc7402-f73e-4cc5-980d-10c3043b3ba8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.878804 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzzn9\" (UniqueName: \"kubernetes.io/projected/92bc7402-f73e-4cc5-980d-10c3043b3ba8-kube-api-access-pzzn9\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.878841 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:29 crc kubenswrapper[4762]: I0217 18:01:29.878850 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bc7402-f73e-4cc5-980d-10c3043b3ba8-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.316459 4762 generic.go:334] "Generic (PLEG): container finished" podID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerID="d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc" exitCode=0 Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.316636 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerDied","Data":"d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc"} Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.317730 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6s9g" event={"ID":"92bc7402-f73e-4cc5-980d-10c3043b3ba8","Type":"ContainerDied","Data":"2bc9fa36ad5173351d8a540450e22f0dae1f30e152df3d98b68da3c4ed2e3543"} Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.316707 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6s9g" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.317814 4762 scope.go:117] "RemoveContainer" containerID="d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.334741 4762 scope.go:117] "RemoveContainer" containerID="cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.349225 4762 scope.go:117] "RemoveContainer" containerID="3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.370197 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6s9g"] Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.374861 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6s9g"] Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.382723 4762 scope.go:117] "RemoveContainer" containerID="d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc" Feb 17 18:01:30 crc kubenswrapper[4762]: E0217 18:01:30.383210 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc\": container with ID starting with d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc not found: ID does not exist" containerID="d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.383252 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc"} err="failed to get container status \"d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc\": rpc error: code = NotFound desc = could not find container 
\"d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc\": container with ID starting with d01ba97e4f2b4a66a9eda9013c84896fad55044b7df9aea17318c8f46fdda0fc not found: ID does not exist" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.383280 4762 scope.go:117] "RemoveContainer" containerID="cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467" Feb 17 18:01:30 crc kubenswrapper[4762]: E0217 18:01:30.383634 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467\": container with ID starting with cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467 not found: ID does not exist" containerID="cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.383662 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467"} err="failed to get container status \"cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467\": rpc error: code = NotFound desc = could not find container \"cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467\": container with ID starting with cb45458fa281a3aa3e217d7735055c045c4aad9e5ee98f0940fcf8957b2d5467 not found: ID does not exist" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.383676 4762 scope.go:117] "RemoveContainer" containerID="3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e" Feb 17 18:01:30 crc kubenswrapper[4762]: E0217 18:01:30.383989 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e\": container with ID starting with 3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e not found: ID does not exist" 
containerID="3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e" Feb 17 18:01:30 crc kubenswrapper[4762]: I0217 18:01:30.384031 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e"} err="failed to get container status \"3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e\": rpc error: code = NotFound desc = could not find container \"3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e\": container with ID starting with 3e539f0f122a0303e0543dbb60f6886ab60775ef0e9ecc1fdf985f7607adfc1e not found: ID does not exist" Feb 17 18:01:31 crc kubenswrapper[4762]: I0217 18:01:31.048319 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" path="/var/lib/kubelet/pods/92bc7402-f73e-4cc5-980d-10c3043b3ba8/volumes" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.265819 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p"] Feb 17 18:01:35 crc kubenswrapper[4762]: E0217 18:01:35.266661 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="extract-utilities" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266676 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="extract-utilities" Feb 17 18:01:35 crc kubenswrapper[4762]: E0217 18:01:35.266687 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="util" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266695 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="util" Feb 17 18:01:35 crc kubenswrapper[4762]: E0217 18:01:35.266704 4762 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="pull" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266711 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="pull" Feb 17 18:01:35 crc kubenswrapper[4762]: E0217 18:01:35.266723 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="extract-content" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266730 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="extract-content" Feb 17 18:01:35 crc kubenswrapper[4762]: E0217 18:01:35.266749 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="registry-server" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266756 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="registry-server" Feb 17 18:01:35 crc kubenswrapper[4762]: E0217 18:01:35.266768 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="extract" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266774 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="extract" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266893 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bc7402-f73e-4cc5-980d-10c3043b3ba8" containerName="registry-server" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.266904 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="12aa14d1-1ff5-4325-8792-d43cfd40cf96" containerName="extract" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.267373 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.269590 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jkll8" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.270368 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.290433 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p"] Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.350800 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05cb543d-eddd-4628-a8bc-168e3a7e5b48-apiservice-cert\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.350854 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05cb543d-eddd-4628-a8bc-168e3a7e5b48-webhook-cert\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.350893 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9cm4\" (UniqueName: \"kubernetes.io/projected/05cb543d-eddd-4628-a8bc-168e3a7e5b48-kube-api-access-j9cm4\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: 
\"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.451743 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05cb543d-eddd-4628-a8bc-168e3a7e5b48-apiservice-cert\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.451886 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05cb543d-eddd-4628-a8bc-168e3a7e5b48-webhook-cert\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.451971 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9cm4\" (UniqueName: \"kubernetes.io/projected/05cb543d-eddd-4628-a8bc-168e3a7e5b48-kube-api-access-j9cm4\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.463644 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05cb543d-eddd-4628-a8bc-168e3a7e5b48-webhook-cert\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.463710 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05cb543d-eddd-4628-a8bc-168e3a7e5b48-apiservice-cert\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.470722 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9cm4\" (UniqueName: \"kubernetes.io/projected/05cb543d-eddd-4628-a8bc-168e3a7e5b48-kube-api-access-j9cm4\") pod \"infra-operator-controller-manager-69b84c89c7-gd74p\" (UID: \"05cb543d-eddd-4628-a8bc-168e3a7e5b48\") " pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.586506 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:35 crc kubenswrapper[4762]: I0217 18:01:35.781284 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p"] Feb 17 18:01:35 crc kubenswrapper[4762]: W0217 18:01:35.788839 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05cb543d_eddd_4628_a8bc_168e3a7e5b48.slice/crio-56cde9665b86720d9ddd2f6d7e97b8a481cc638a8e1893bf2332423b0c464f54 WatchSource:0}: Error finding container 56cde9665b86720d9ddd2f6d7e97b8a481cc638a8e1893bf2332423b0c464f54: Status 404 returned error can't find the container with id 56cde9665b86720d9ddd2f6d7e97b8a481cc638a8e1893bf2332423b0c464f54 Feb 17 18:01:36 crc kubenswrapper[4762]: I0217 18:01:36.351987 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" 
event={"ID":"05cb543d-eddd-4628-a8bc-168e3a7e5b48","Type":"ContainerStarted","Data":"56cde9665b86720d9ddd2f6d7e97b8a481cc638a8e1893bf2332423b0c464f54"} Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.367105 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" event={"ID":"05cb543d-eddd-4628-a8bc-168e3a7e5b48","Type":"ContainerStarted","Data":"bd9a07fa782d77bcb78c0b48a1754fefcb432abcd272daaefb4d0c077e40c4ca"} Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.367501 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.385321 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" podStartSLOduration=1.622002264 podStartE2EDuration="3.385304565s" podCreationTimestamp="2026-02-17 18:01:35 +0000 UTC" firstStartedPulling="2026-02-17 18:01:35.791467126 +0000 UTC m=+847.436385136" lastFinishedPulling="2026-02-17 18:01:37.554769427 +0000 UTC m=+849.199687437" observedRunningTime="2026-02-17 18:01:38.385043488 +0000 UTC m=+850.029961508" watchObservedRunningTime="2026-02-17 18:01:38.385304565 +0000 UTC m=+850.030222575" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.698655 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.699990 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.702206 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.702222 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-lctbl" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.702358 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.703188 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.713503 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.714425 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.715926 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.720990 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.721892 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.724110 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.729863 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.734429 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.801872 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvfd\" (UniqueName: \"kubernetes.io/projected/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-kube-api-access-jjvfd\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.801934 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6f6\" (UniqueName: \"kubernetes.io/projected/1f247f60-b429-4a5b-81c5-61f533de7ef9-kube-api-access-7c6f6\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.801957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-config-data-default\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.801974 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-config-data-default\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802094 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-kolla-config\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802145 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802319 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802342 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802359 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-operator-scripts\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802382 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-config-data-generated\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802409 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-config-data-default\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802472 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-kolla-config\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802540 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802656 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-kolla-config\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802730 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvwk\" (UniqueName: \"kubernetes.io/projected/f8e941eb-7039-4a71-88df-914907d84acb-kube-api-access-tfvwk\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802752 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1f247f60-b429-4a5b-81c5-61f533de7ef9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802827 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.802891 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f8e941eb-7039-4a71-88df-914907d84acb-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904163 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904210 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-kolla-config\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904233 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvwk\" (UniqueName: \"kubernetes.io/projected/f8e941eb-7039-4a71-88df-914907d84acb-kube-api-access-tfvwk\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904249 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1f247f60-b429-4a5b-81c5-61f533de7ef9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904275 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f8e941eb-7039-4a71-88df-914907d84acb-config-data-generated\") pod \"openstack-galera-2\" (UID: 
\"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904338 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvfd\" (UniqueName: \"kubernetes.io/projected/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-kube-api-access-jjvfd\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904380 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6f6\" (UniqueName: \"kubernetes.io/projected/1f247f60-b429-4a5b-81c5-61f533de7ef9-kube-api-access-7c6f6\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904406 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-config-data-default\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904427 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-config-data-default\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904456 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-kolla-config\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " 
pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904481 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904533 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-operator-scripts\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904613 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-config-data-generated\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904658 
4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-config-data-default\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-kolla-config\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904716 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.904890 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1f247f60-b429-4a5b-81c5-61f533de7ef9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.905044 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f8e941eb-7039-4a71-88df-914907d84acb-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.905277 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-kolla-config\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.905369 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.905581 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-config-data-generated\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.905900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-kolla-config\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.906087 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-config-data-default\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.906257 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-config-data-default\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.906300 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-kolla-config\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.906319 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-config-data-default\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.906763 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e941eb-7039-4a71-88df-914907d84acb-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.907142 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-operator-scripts\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.907223 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: 
\"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.907404 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f247f60-b429-4a5b-81c5-61f533de7ef9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.924664 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.929515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.929541 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvfd\" (UniqueName: \"kubernetes.io/projected/c0dd6fbc-c7a8-46fe-aceb-25e59e083854-kube-api-access-jjvfd\") pod \"openstack-galera-1\" (UID: \"c0dd6fbc-c7a8-46fe-aceb-25e59e083854\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.932083 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvwk\" (UniqueName: \"kubernetes.io/projected/f8e941eb-7039-4a71-88df-914907d84acb-kube-api-access-tfvwk\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc 
kubenswrapper[4762]: I0217 18:01:38.932196 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"f8e941eb-7039-4a71-88df-914907d84acb\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:38 crc kubenswrapper[4762]: I0217 18:01:38.932845 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6f6\" (UniqueName: \"kubernetes.io/projected/1f247f60-b429-4a5b-81c5-61f533de7ef9-kube-api-access-7c6f6\") pod \"openstack-galera-0\" (UID: \"1f247f60-b429-4a5b-81c5-61f533de7ef9\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.025437 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.048937 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.053247 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.318819 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.377256 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"c0dd6fbc-c7a8-46fe-aceb-25e59e083854","Type":"ContainerStarted","Data":"16ffeaa5354480e0f1b1be4da8a5b9caf313025d73c7ba9cb2ebafff32582698"} Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.383366 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Feb 17 18:01:39 crc kubenswrapper[4762]: W0217 18:01:39.426383 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e941eb_7039_4a71_88df_914907d84acb.slice/crio-78b7c7c02bd8fc6a8d03d0e5fa6ba659d00b747a16d645cef51e3eba0e29e58a WatchSource:0}: Error finding container 78b7c7c02bd8fc6a8d03d0e5fa6ba659d00b747a16d645cef51e3eba0e29e58a: Status 404 returned error can't find the container with id 78b7c7c02bd8fc6a8d03d0e5fa6ba659d00b747a16d645cef51e3eba0e29e58a Feb 17 18:01:39 crc kubenswrapper[4762]: I0217 18:01:39.472872 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Feb 17 18:01:39 crc kubenswrapper[4762]: W0217 18:01:39.482158 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f247f60_b429_4a5b_81c5_61f533de7ef9.slice/crio-a945e6feeb60330843290bf054796056ad7c74ce8d42d941f0609ea39e766328 WatchSource:0}: Error finding container a945e6feeb60330843290bf054796056ad7c74ce8d42d941f0609ea39e766328: Status 404 returned error can't find the container with id a945e6feeb60330843290bf054796056ad7c74ce8d42d941f0609ea39e766328 Feb 17 18:01:40 crc kubenswrapper[4762]: 
I0217 18:01:40.397659 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f8e941eb-7039-4a71-88df-914907d84acb","Type":"ContainerStarted","Data":"78b7c7c02bd8fc6a8d03d0e5fa6ba659d00b747a16d645cef51e3eba0e29e58a"} Feb 17 18:01:40 crc kubenswrapper[4762]: I0217 18:01:40.400273 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"1f247f60-b429-4a5b-81c5-61f533de7ef9","Type":"ContainerStarted","Data":"a945e6feeb60330843290bf054796056ad7c74ce8d42d941f0609ea39e766328"} Feb 17 18:01:45 crc kubenswrapper[4762]: I0217 18:01:45.590847 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-69b84c89c7-gd74p" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.447937 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f8e941eb-7039-4a71-88df-914907d84acb","Type":"ContainerStarted","Data":"ce86fff90f848d8a356969c4f67a5b901d027ce93f7d63bc2da8c2ee49bd09f4"} Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.449289 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"1f247f60-b429-4a5b-81c5-61f533de7ef9","Type":"ContainerStarted","Data":"cb8bb978261f587e2e921100d3f694e6abc1b1bf414fa6b4a316f71d04deb013"} Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.450828 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"c0dd6fbc-c7a8-46fe-aceb-25e59e083854","Type":"ContainerStarted","Data":"83efb7900d6609f80cabbb1f732e7474b1b4b69974d9926770c7efaf38d71cd1"} Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.840311 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.841039 4762 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.846143 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.846358 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-wjqhb" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.853098 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.884856 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9q96\" (UniqueName: \"kubernetes.io/projected/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-kube-api-access-l9q96\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.885083 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-config-data\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.885149 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-kolla-config\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.986369 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9q96\" (UniqueName: 
\"kubernetes.io/projected/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-kube-api-access-l9q96\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.986539 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-config-data\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.986575 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-kolla-config\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.987603 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-kolla-config\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:48 crc kubenswrapper[4762]: I0217 18:01:48.987640 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-config-data\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:49 crc kubenswrapper[4762]: I0217 18:01:49.003146 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9q96\" (UniqueName: \"kubernetes.io/projected/bc139701-f0d8-4dd3-8724-69e3e8f42e5f-kube-api-access-l9q96\") pod \"memcached-0\" (UID: \"bc139701-f0d8-4dd3-8724-69e3e8f42e5f\") " pod="glance-kuttl-tests/memcached-0" Feb 17 
18:01:49 crc kubenswrapper[4762]: I0217 18:01:49.158104 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:49 crc kubenswrapper[4762]: I0217 18:01:49.619513 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Feb 17 18:01:49 crc kubenswrapper[4762]: W0217 18:01:49.620847 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc139701_f0d8_4dd3_8724_69e3e8f42e5f.slice/crio-8e66b13683a3bddb0fbe6dccd2e793899ae651c3ed527fe04279e2a21d977b1b WatchSource:0}: Error finding container 8e66b13683a3bddb0fbe6dccd2e793899ae651c3ed527fe04279e2a21d977b1b: Status 404 returned error can't find the container with id 8e66b13683a3bddb0fbe6dccd2e793899ae651c3ed527fe04279e2a21d977b1b Feb 17 18:01:50 crc kubenswrapper[4762]: I0217 18:01:50.463000 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"bc139701-f0d8-4dd3-8724-69e3e8f42e5f","Type":"ContainerStarted","Data":"8e66b13683a3bddb0fbe6dccd2e793899ae651c3ed527fe04279e2a21d977b1b"} Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.694737 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9j27d"] Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.695747 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.698072 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-8nltb" Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.707212 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9j27d"] Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.747399 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vdz\" (UniqueName: \"kubernetes.io/projected/2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61-kube-api-access-n2vdz\") pod \"rabbitmq-cluster-operator-index-9j27d\" (UID: \"2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.848595 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vdz\" (UniqueName: \"kubernetes.io/projected/2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61-kube-api-access-n2vdz\") pod \"rabbitmq-cluster-operator-index-9j27d\" (UID: \"2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:01:51 crc kubenswrapper[4762]: I0217 18:01:51.877581 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vdz\" (UniqueName: \"kubernetes.io/projected/2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61-kube-api-access-n2vdz\") pod \"rabbitmq-cluster-operator-index-9j27d\" (UID: \"2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.156914 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.475514 4762 generic.go:334] "Generic (PLEG): container finished" podID="f8e941eb-7039-4a71-88df-914907d84acb" containerID="ce86fff90f848d8a356969c4f67a5b901d027ce93f7d63bc2da8c2ee49bd09f4" exitCode=0 Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.475672 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f8e941eb-7039-4a71-88df-914907d84acb","Type":"ContainerDied","Data":"ce86fff90f848d8a356969c4f67a5b901d027ce93f7d63bc2da8c2ee49bd09f4"} Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.477482 4762 generic.go:334] "Generic (PLEG): container finished" podID="1f247f60-b429-4a5b-81c5-61f533de7ef9" containerID="cb8bb978261f587e2e921100d3f694e6abc1b1bf414fa6b4a316f71d04deb013" exitCode=0 Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.477549 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"1f247f60-b429-4a5b-81c5-61f533de7ef9","Type":"ContainerDied","Data":"cb8bb978261f587e2e921100d3f694e6abc1b1bf414fa6b4a316f71d04deb013"} Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.480727 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"bc139701-f0d8-4dd3-8724-69e3e8f42e5f","Type":"ContainerStarted","Data":"fd088ec306a4eb3febf6264df81aa5cf23b5a174ebf65b9bc241d30eb373f112"} Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.481292 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.484638 4762 generic.go:334] "Generic (PLEG): container finished" podID="c0dd6fbc-c7a8-46fe-aceb-25e59e083854" containerID="83efb7900d6609f80cabbb1f732e7474b1b4b69974d9926770c7efaf38d71cd1" exitCode=0 Feb 17 18:01:52 crc kubenswrapper[4762]: 
I0217 18:01:52.484706 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"c0dd6fbc-c7a8-46fe-aceb-25e59e083854","Type":"ContainerDied","Data":"83efb7900d6609f80cabbb1f732e7474b1b4b69974d9926770c7efaf38d71cd1"} Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.602490 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=2.861292659 podStartE2EDuration="4.602446479s" podCreationTimestamp="2026-02-17 18:01:48 +0000 UTC" firstStartedPulling="2026-02-17 18:01:49.622421768 +0000 UTC m=+861.267339768" lastFinishedPulling="2026-02-17 18:01:51.363575568 +0000 UTC m=+863.008493588" observedRunningTime="2026-02-17 18:01:52.578647687 +0000 UTC m=+864.223565707" watchObservedRunningTime="2026-02-17 18:01:52.602446479 +0000 UTC m=+864.247364479" Feb 17 18:01:52 crc kubenswrapper[4762]: I0217 18:01:52.677016 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9j27d"] Feb 17 18:01:52 crc kubenswrapper[4762]: W0217 18:01:52.688567 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8ca2b8_ee46_4ebf_a619_8fcdab8d2c61.slice/crio-e7e257ccb7568957d33e748349e6f1fd4ed6522b8a3ba93d459c1269446a4440 WatchSource:0}: Error finding container e7e257ccb7568957d33e748349e6f1fd4ed6522b8a3ba93d459c1269446a4440: Status 404 returned error can't find the container with id e7e257ccb7568957d33e748349e6f1fd4ed6522b8a3ba93d459c1269446a4440 Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.497674 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"c0dd6fbc-c7a8-46fe-aceb-25e59e083854","Type":"ContainerStarted","Data":"c989065f508e3f69e80dd47599711d0e375ee8d35bbb6ee9ee085d8cfcade8ca"} Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.509748 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" event={"ID":"2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61","Type":"ContainerStarted","Data":"e7e257ccb7568957d33e748349e6f1fd4ed6522b8a3ba93d459c1269446a4440"} Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.523193 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f8e941eb-7039-4a71-88df-914907d84acb","Type":"ContainerStarted","Data":"c91d99bef82590ceee57268e392350aa39bcc7d142d5a059d7304e3c74e7f841"} Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.536337 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=8.71786084 podStartE2EDuration="16.536315792s" podCreationTimestamp="2026-02-17 18:01:37 +0000 UTC" firstStartedPulling="2026-02-17 18:01:39.331470648 +0000 UTC m=+850.976388658" lastFinishedPulling="2026-02-17 18:01:47.1499256 +0000 UTC m=+858.794843610" observedRunningTime="2026-02-17 18:01:53.530569336 +0000 UTC m=+865.175487356" watchObservedRunningTime="2026-02-17 18:01:53.536315792 +0000 UTC m=+865.181233812" Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.556299 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"1f247f60-b429-4a5b-81c5-61f533de7ef9","Type":"ContainerStarted","Data":"9efff93defd8a6925d679097c8de0b1499f1bc65d2a17b6c0d3dfae906d07364"} Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.560223 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=8.81864642 podStartE2EDuration="16.560210156s" podCreationTimestamp="2026-02-17 18:01:37 +0000 UTC" firstStartedPulling="2026-02-17 18:01:39.427754285 +0000 UTC m=+851.072672295" lastFinishedPulling="2026-02-17 18:01:47.169318021 +0000 UTC m=+858.814236031" 
observedRunningTime="2026-02-17 18:01:53.559848907 +0000 UTC m=+865.204766917" watchObservedRunningTime="2026-02-17 18:01:53.560210156 +0000 UTC m=+865.205128166" Feb 17 18:01:53 crc kubenswrapper[4762]: I0217 18:01:53.612083 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=8.91386005 podStartE2EDuration="16.612065528s" podCreationTimestamp="2026-02-17 18:01:37 +0000 UTC" firstStartedPulling="2026-02-17 18:01:39.484119152 +0000 UTC m=+851.129037162" lastFinishedPulling="2026-02-17 18:01:47.18232463 +0000 UTC m=+858.827242640" observedRunningTime="2026-02-17 18:01:53.608201341 +0000 UTC m=+865.253119351" watchObservedRunningTime="2026-02-17 18:01:53.612065528 +0000 UTC m=+865.256983538" Feb 17 18:01:57 crc kubenswrapper[4762]: I0217 18:01:57.591306 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" event={"ID":"2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61","Type":"ContainerStarted","Data":"bdffca2b25cafa84d628e21b0646e305e135b1897648f0c49fd772ee6699fe8a"} Feb 17 18:01:57 crc kubenswrapper[4762]: I0217 18:01:57.608607 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" podStartSLOduration=2.7711241429999998 podStartE2EDuration="6.608585963s" podCreationTimestamp="2026-02-17 18:01:51 +0000 UTC" firstStartedPulling="2026-02-17 18:01:52.690379994 +0000 UTC m=+864.335298004" lastFinishedPulling="2026-02-17 18:01:56.527841814 +0000 UTC m=+868.172759824" observedRunningTime="2026-02-17 18:01:57.606570752 +0000 UTC m=+869.251488762" watchObservedRunningTime="2026-02-17 18:01:57.608585963 +0000 UTC m=+869.253503983" Feb 17 18:01:58 crc kubenswrapper[4762]: E0217 18:01:58.597286 4762 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.195:41570->38.102.83.195:35251: write tcp 
38.102.83.195:41570->38.102.83.195:35251: write: broken pipe Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.026597 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.026670 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.050443 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.050726 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.054142 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.054947 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:01:59 crc kubenswrapper[4762]: I0217 18:01:59.159251 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.302291 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jb6j9"] Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.305302 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.328500 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jb6j9"] Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.377410 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9nc\" (UniqueName: \"kubernetes.io/projected/96fbfab0-da5b-4690-9cde-95141db9bc4e-kube-api-access-2s9nc\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.377601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-utilities\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.377778 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-catalog-content\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.479672 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-utilities\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.479756 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-catalog-content\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.479800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9nc\" (UniqueName: \"kubernetes.io/projected/96fbfab0-da5b-4690-9cde-95141db9bc4e-kube-api-access-2s9nc\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.480516 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-utilities\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.480572 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-catalog-content\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.505951 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9nc\" (UniqueName: \"kubernetes.io/projected/96fbfab0-da5b-4690-9cde-95141db9bc4e-kube-api-access-2s9nc\") pod \"community-operators-jb6j9\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.637368 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:01 crc kubenswrapper[4762]: I0217 18:02:01.929716 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.033715 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.086264 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jb6j9"] Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.157909 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.158256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.186456 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.619495 4762 generic.go:334] "Generic (PLEG): container finished" podID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerID="106c9d76fc8a4be07ceb330721794d7933a6d8b9d41b77f55b4cd7e3895ba19e" exitCode=0 Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.619600 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6j9" event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerDied","Data":"106c9d76fc8a4be07ceb330721794d7933a6d8b9d41b77f55b4cd7e3895ba19e"} Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.619697 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6j9" 
event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerStarted","Data":"5c894af7d4fa736b83e496d52b0b394f1ed3975b41346b762d0808a06f5bb6d3"} Feb 17 18:02:02 crc kubenswrapper[4762]: I0217 18:02:02.642057 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-9j27d" Feb 17 18:02:03 crc kubenswrapper[4762]: I0217 18:02:03.628943 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6j9" event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerStarted","Data":"1e44d43e9aa9a4a29877429c1a9dbfc9ec3acfc669a15ac0192dc44c358a7d63"} Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.635494 4762 generic.go:334] "Generic (PLEG): container finished" podID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerID="1e44d43e9aa9a4a29877429c1a9dbfc9ec3acfc669a15ac0192dc44c358a7d63" exitCode=0 Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.635608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6j9" event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerDied","Data":"1e44d43e9aa9a4a29877429c1a9dbfc9ec3acfc669a15ac0192dc44c358a7d63"} Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.729504 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs"] Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.730634 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.732862 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ph6qt" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.740864 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs"] Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.827341 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.827400 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.827459 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnb5m\" (UniqueName: \"kubernetes.io/projected/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-kube-api-access-hnb5m\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 
18:02:04.928736 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.928774 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.928828 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnb5m\" (UniqueName: \"kubernetes.io/projected/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-kube-api-access-hnb5m\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.929171 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.929258 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:04 crc kubenswrapper[4762]: I0217 18:02:04.947323 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnb5m\" (UniqueName: \"kubernetes.io/projected/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-kube-api-access-hnb5m\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:05 crc kubenswrapper[4762]: I0217 18:02:05.048104 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:05 crc kubenswrapper[4762]: I0217 18:02:05.465725 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs"] Feb 17 18:02:05 crc kubenswrapper[4762]: I0217 18:02:05.642976 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6j9" event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerStarted","Data":"42c0caf7041856d9f94bc2243d35734c69da02ca40f3398e1f2c47209e7b0e46"} Feb 17 18:02:05 crc kubenswrapper[4762]: I0217 18:02:05.644164 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" event={"ID":"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660","Type":"ContainerStarted","Data":"6619a2fe5e5be69d9df0a538ee5a01dd38e59404be34e99fec9d9253c6daeb22"} Feb 17 18:02:05 crc kubenswrapper[4762]: I0217 18:02:05.663047 4762 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-jb6j9" podStartSLOduration=2.236306684 podStartE2EDuration="4.663024665s" podCreationTimestamp="2026-02-17 18:02:01 +0000 UTC" firstStartedPulling="2026-02-17 18:02:02.621169849 +0000 UTC m=+874.266087859" lastFinishedPulling="2026-02-17 18:02:05.04788783 +0000 UTC m=+876.692805840" observedRunningTime="2026-02-17 18:02:05.659314701 +0000 UTC m=+877.304232711" watchObservedRunningTime="2026-02-17 18:02:05.663024665 +0000 UTC m=+877.307942685" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.784279 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-gtlzz"] Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.788320 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.790962 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.791738 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-gtlzz"] Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.872412 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t249f\" (UniqueName: \"kubernetes.io/projected/885f2c17-dddb-4f85-90bf-90ba0e38255a-kube-api-access-t249f\") pod \"root-account-create-update-gtlzz\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.872449 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885f2c17-dddb-4f85-90bf-90ba0e38255a-operator-scripts\") pod 
\"root-account-create-update-gtlzz\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.973252 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t249f\" (UniqueName: \"kubernetes.io/projected/885f2c17-dddb-4f85-90bf-90ba0e38255a-kube-api-access-t249f\") pod \"root-account-create-update-gtlzz\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.973290 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885f2c17-dddb-4f85-90bf-90ba0e38255a-operator-scripts\") pod \"root-account-create-update-gtlzz\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.974150 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885f2c17-dddb-4f85-90bf-90ba0e38255a-operator-scripts\") pod \"root-account-create-update-gtlzz\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:07.998483 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t249f\" (UniqueName: \"kubernetes.io/projected/885f2c17-dddb-4f85-90bf-90ba0e38255a-kube-api-access-t249f\") pod \"root-account-create-update-gtlzz\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:08.112411 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.125498 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="f8e941eb-7039-4a71-88df-914907d84acb" containerName="galera" probeResult="failure" output=< Feb 17 18:02:09 crc kubenswrapper[4762]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Feb 17 18:02:09 crc kubenswrapper[4762]: > Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.427935 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-gtlzz"] Feb 17 18:02:09 crc kubenswrapper[4762]: W0217 18:02:09.441403 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod885f2c17_dddb_4f85_90bf_90ba0e38255a.slice/crio-e14b50062a447c29c5f89d94ce2f9582409ba222b1a8395e93bed0ee54504fda WatchSource:0}: Error finding container e14b50062a447c29c5f89d94ce2f9582409ba222b1a8395e93bed0ee54504fda: Status 404 returned error can't find the container with id e14b50062a447c29c5f89d94ce2f9582409ba222b1a8395e93bed0ee54504fda Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.670338 4762 generic.go:334] "Generic (PLEG): container finished" podID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerID="c06a421834ff44cb87eeb6b96067a56f13be118bfa555b2042e399cf92e1f64e" exitCode=0 Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.670414 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" event={"ID":"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660","Type":"ContainerDied","Data":"c06a421834ff44cb87eeb6b96067a56f13be118bfa555b2042e399cf92e1f64e"} Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.673560 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/root-account-create-update-gtlzz" event={"ID":"885f2c17-dddb-4f85-90bf-90ba0e38255a","Type":"ContainerStarted","Data":"fd134a566ddf830408d2235b5505933ed2c746d686d40322303111272f2ca1b1"} Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.673611 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-gtlzz" event={"ID":"885f2c17-dddb-4f85-90bf-90ba0e38255a","Type":"ContainerStarted","Data":"e14b50062a447c29c5f89d94ce2f9582409ba222b1a8395e93bed0ee54504fda"} Feb 17 18:02:09 crc kubenswrapper[4762]: I0217 18:02:09.711444 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/root-account-create-update-gtlzz" podStartSLOduration=2.711422233 podStartE2EDuration="2.711422233s" podCreationTimestamp="2026-02-17 18:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:09.706456617 +0000 UTC m=+881.351374627" watchObservedRunningTime="2026-02-17 18:02:09.711422233 +0000 UTC m=+881.356340243" Feb 17 18:02:10 crc kubenswrapper[4762]: I0217 18:02:10.680633 4762 generic.go:334] "Generic (PLEG): container finished" podID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerID="fd8e4820c22f4f05b0c89f32256118547ebdbcbd2c030e76b4d61be8f6ba412d" exitCode=0 Feb 17 18:02:10 crc kubenswrapper[4762]: I0217 18:02:10.680831 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" event={"ID":"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660","Type":"ContainerDied","Data":"fd8e4820c22f4f05b0c89f32256118547ebdbcbd2c030e76b4d61be8f6ba412d"} Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.638137 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.638487 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.685965 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.690776 4762 generic.go:334] "Generic (PLEG): container finished" podID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerID="1a20c3033a68fac2556e0d9cdbfaced2565fddb0d502961cd7e0b24a37d8cd52" exitCode=0 Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.690845 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" event={"ID":"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660","Type":"ContainerDied","Data":"1a20c3033a68fac2556e0d9cdbfaced2565fddb0d502961cd7e0b24a37d8cd52"} Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.692442 4762 generic.go:334] "Generic (PLEG): container finished" podID="885f2c17-dddb-4f85-90bf-90ba0e38255a" containerID="fd134a566ddf830408d2235b5505933ed2c746d686d40322303111272f2ca1b1" exitCode=0 Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.692517 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-gtlzz" event={"ID":"885f2c17-dddb-4f85-90bf-90ba0e38255a","Type":"ContainerDied","Data":"fd134a566ddf830408d2235b5505933ed2c746d686d40322303111272f2ca1b1"} Feb 17 18:02:11 crc kubenswrapper[4762]: I0217 18:02:11.733766 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:12 crc kubenswrapper[4762]: I0217 18:02:12.979467 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.035338 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-bundle\") pod \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.035482 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-util\") pod \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.035527 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnb5m\" (UniqueName: \"kubernetes.io/projected/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-kube-api-access-hnb5m\") pod \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\" (UID: \"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660\") " Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.036537 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-bundle" (OuterVolumeSpecName: "bundle") pod "b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" (UID: "b8e92bbe-0a6e-470d-8fcb-d774f8ae3660"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.042496 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-kube-api-access-hnb5m" (OuterVolumeSpecName: "kube-api-access-hnb5m") pod "b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" (UID: "b8e92bbe-0a6e-470d-8fcb-d774f8ae3660"). InnerVolumeSpecName "kube-api-access-hnb5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.048427 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-util" (OuterVolumeSpecName: "util") pod "b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" (UID: "b8e92bbe-0a6e-470d-8fcb-d774f8ae3660"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.078933 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.136655 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885f2c17-dddb-4f85-90bf-90ba0e38255a-operator-scripts\") pod \"885f2c17-dddb-4f85-90bf-90ba0e38255a\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.136777 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t249f\" (UniqueName: \"kubernetes.io/projected/885f2c17-dddb-4f85-90bf-90ba0e38255a-kube-api-access-t249f\") pod \"885f2c17-dddb-4f85-90bf-90ba0e38255a\" (UID: \"885f2c17-dddb-4f85-90bf-90ba0e38255a\") " Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.137078 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.137090 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnb5m\" (UniqueName: \"kubernetes.io/projected/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-kube-api-access-hnb5m\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.137100 4762 
reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e92bbe-0a6e-470d-8fcb-d774f8ae3660-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.138165 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885f2c17-dddb-4f85-90bf-90ba0e38255a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "885f2c17-dddb-4f85-90bf-90ba0e38255a" (UID: "885f2c17-dddb-4f85-90bf-90ba0e38255a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.144902 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/885f2c17-dddb-4f85-90bf-90ba0e38255a-kube-api-access-t249f" (OuterVolumeSpecName: "kube-api-access-t249f") pod "885f2c17-dddb-4f85-90bf-90ba0e38255a" (UID: "885f2c17-dddb-4f85-90bf-90ba0e38255a"). InnerVolumeSpecName "kube-api-access-t249f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.238478 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t249f\" (UniqueName: \"kubernetes.io/projected/885f2c17-dddb-4f85-90bf-90ba0e38255a-kube-api-access-t249f\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.238519 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885f2c17-dddb-4f85-90bf-90ba0e38255a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.707990 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" event={"ID":"b8e92bbe-0a6e-470d-8fcb-d774f8ae3660","Type":"ContainerDied","Data":"6619a2fe5e5be69d9df0a538ee5a01dd38e59404be34e99fec9d9253c6daeb22"} Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.708366 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6619a2fe5e5be69d9df0a538ee5a01dd38e59404be34e99fec9d9253c6daeb22" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.708028 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.709515 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-gtlzz" event={"ID":"885f2c17-dddb-4f85-90bf-90ba0e38255a","Type":"ContainerDied","Data":"e14b50062a447c29c5f89d94ce2f9582409ba222b1a8395e93bed0ee54504fda"} Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.709566 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e14b50062a447c29c5f89d94ce2f9582409ba222b1a8395e93bed0ee54504fda" Feb 17 18:02:13 crc kubenswrapper[4762]: I0217 18:02:13.709557 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-gtlzz" Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.495519 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jb6j9"] Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.496613 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jb6j9" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="registry-server" containerID="cri-o://42c0caf7041856d9f94bc2243d35734c69da02ca40f3398e1f2c47209e7b0e46" gracePeriod=2 Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.542732 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.691417 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.730074 4762 generic.go:334] "Generic (PLEG): container finished" podID="96fbfab0-da5b-4690-9cde-95141db9bc4e" 
containerID="42c0caf7041856d9f94bc2243d35734c69da02ca40f3398e1f2c47209e7b0e46" exitCode=0 Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.730344 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6j9" event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerDied","Data":"42c0caf7041856d9f94bc2243d35734c69da02ca40f3398e1f2c47209e7b0e46"} Feb 17 18:02:14 crc kubenswrapper[4762]: I0217 18:02:14.938638 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.062675 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-catalog-content\") pod \"96fbfab0-da5b-4690-9cde-95141db9bc4e\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.062872 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s9nc\" (UniqueName: \"kubernetes.io/projected/96fbfab0-da5b-4690-9cde-95141db9bc4e-kube-api-access-2s9nc\") pod \"96fbfab0-da5b-4690-9cde-95141db9bc4e\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.064441 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-utilities\") pod \"96fbfab0-da5b-4690-9cde-95141db9bc4e\" (UID: \"96fbfab0-da5b-4690-9cde-95141db9bc4e\") " Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.065350 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-utilities" (OuterVolumeSpecName: "utilities") pod "96fbfab0-da5b-4690-9cde-95141db9bc4e" (UID: 
"96fbfab0-da5b-4690-9cde-95141db9bc4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.069718 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fbfab0-da5b-4690-9cde-95141db9bc4e-kube-api-access-2s9nc" (OuterVolumeSpecName: "kube-api-access-2s9nc") pod "96fbfab0-da5b-4690-9cde-95141db9bc4e" (UID: "96fbfab0-da5b-4690-9cde-95141db9bc4e"). InnerVolumeSpecName "kube-api-access-2s9nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.114704 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96fbfab0-da5b-4690-9cde-95141db9bc4e" (UID: "96fbfab0-da5b-4690-9cde-95141db9bc4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.166614 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.166673 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96fbfab0-da5b-4690-9cde-95141db9bc4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.166689 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s9nc\" (UniqueName: \"kubernetes.io/projected/96fbfab0-da5b-4690-9cde-95141db9bc4e-kube-api-access-2s9nc\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.739130 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jb6j9" event={"ID":"96fbfab0-da5b-4690-9cde-95141db9bc4e","Type":"ContainerDied","Data":"5c894af7d4fa736b83e496d52b0b394f1ed3975b41346b762d0808a06f5bb6d3"} Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.739197 4762 scope.go:117] "RemoveContainer" containerID="42c0caf7041856d9f94bc2243d35734c69da02ca40f3398e1f2c47209e7b0e46" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.739208 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb6j9" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.741005 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.771001 4762 scope.go:117] "RemoveContainer" containerID="1e44d43e9aa9a4a29877429c1a9dbfc9ec3acfc669a15ac0192dc44c358a7d63" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.790053 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jb6j9"] Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.790960 4762 scope.go:117] "RemoveContainer" containerID="106c9d76fc8a4be07ceb330721794d7933a6d8b9d41b77f55b4cd7e3895ba19e" Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.795916 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jb6j9"] Feb 17 18:02:15 crc kubenswrapper[4762]: I0217 18:02:15.818446 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 18:02:17 crc kubenswrapper[4762]: I0217 18:02:17.054460 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" path="/var/lib/kubelet/pods/96fbfab0-da5b-4690-9cde-95141db9bc4e/volumes" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.560551 4762 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n"] Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561356 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="extract" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561373 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="extract" Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561390 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="util" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561398 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="util" Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561417 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="extract-content" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561426 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="extract-content" Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561441 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885f2c17-dddb-4f85-90bf-90ba0e38255a" containerName="mariadb-account-create-update" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561450 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="885f2c17-dddb-4f85-90bf-90ba0e38255a" containerName="mariadb-account-create-update" Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561464 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="pull" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561471 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="pull" Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561481 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="extract-utilities" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561489 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="extract-utilities" Feb 17 18:02:27 crc kubenswrapper[4762]: E0217 18:02:27.561504 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="registry-server" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561511 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="registry-server" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561680 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fbfab0-da5b-4690-9cde-95141db9bc4e" containerName="registry-server" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561697 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="885f2c17-dddb-4f85-90bf-90ba0e38255a" containerName="mariadb-account-create-update" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.561707 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e92bbe-0a6e-470d-8fcb-d774f8ae3660" containerName="extract" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.562223 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.564339 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-vp7rb" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.573708 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n"] Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.729501 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrwk\" (UniqueName: \"kubernetes.io/projected/3c5f4f80-b6f2-47d5-a966-2f19b2911a99-kube-api-access-vjrwk\") pod \"rabbitmq-cluster-operator-779fc9694b-v4s4n\" (UID: \"3c5f4f80-b6f2-47d5-a966-2f19b2911a99\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.830905 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrwk\" (UniqueName: \"kubernetes.io/projected/3c5f4f80-b6f2-47d5-a966-2f19b2911a99-kube-api-access-vjrwk\") pod \"rabbitmq-cluster-operator-779fc9694b-v4s4n\" (UID: \"3c5f4f80-b6f2-47d5-a966-2f19b2911a99\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.850237 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrwk\" (UniqueName: \"kubernetes.io/projected/3c5f4f80-b6f2-47d5-a966-2f19b2911a99-kube-api-access-vjrwk\") pod \"rabbitmq-cluster-operator-779fc9694b-v4s4n\" (UID: \"3c5f4f80-b6f2-47d5-a966-2f19b2911a99\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" Feb 17 18:02:27 crc kubenswrapper[4762]: I0217 18:02:27.878442 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" Feb 17 18:02:28 crc kubenswrapper[4762]: I0217 18:02:28.308572 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n"] Feb 17 18:02:28 crc kubenswrapper[4762]: I0217 18:02:28.817015 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" event={"ID":"3c5f4f80-b6f2-47d5-a966-2f19b2911a99","Type":"ContainerStarted","Data":"89c41ee290d9c272f65d1d415760d020d2be220d6ef60106b222d079a1e8e9a3"} Feb 17 18:02:32 crc kubenswrapper[4762]: I0217 18:02:32.842505 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" event={"ID":"3c5f4f80-b6f2-47d5-a966-2f19b2911a99","Type":"ContainerStarted","Data":"38688021a870c0d0a971a17f86c0bdf78e5a4e4e50146f5fc3253bf7a05f2627"} Feb 17 18:02:32 crc kubenswrapper[4762]: I0217 18:02:32.857719 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-v4s4n" podStartSLOduration=2.444351348 podStartE2EDuration="5.857697245s" podCreationTimestamp="2026-02-17 18:02:27 +0000 UTC" firstStartedPulling="2026-02-17 18:02:28.32055509 +0000 UTC m=+899.965473110" lastFinishedPulling="2026-02-17 18:02:31.733900997 +0000 UTC m=+903.378819007" observedRunningTime="2026-02-17 18:02:32.853901919 +0000 UTC m=+904.498819949" watchObservedRunningTime="2026-02-17 18:02:32.857697245 +0000 UTC m=+904.502615275" Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.493726 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-j2hm8"] Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.494932 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.497380 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-547d9" Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.501596 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-j2hm8"] Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.656579 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-596sb\" (UniqueName: \"kubernetes.io/projected/66cbf86e-4179-4923-9177-343729807287-kube-api-access-596sb\") pod \"keystone-operator-index-j2hm8\" (UID: \"66cbf86e-4179-4923-9177-343729807287\") " pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.758185 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-596sb\" (UniqueName: \"kubernetes.io/projected/66cbf86e-4179-4923-9177-343729807287-kube-api-access-596sb\") pod \"keystone-operator-index-j2hm8\" (UID: \"66cbf86e-4179-4923-9177-343729807287\") " pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.779100 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-596sb\" (UniqueName: \"kubernetes.io/projected/66cbf86e-4179-4923-9177-343729807287-kube-api-access-596sb\") pod \"keystone-operator-index-j2hm8\" (UID: \"66cbf86e-4179-4923-9177-343729807287\") " pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:36 crc kubenswrapper[4762]: I0217 18:02:36.809395 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:37 crc kubenswrapper[4762]: I0217 18:02:37.207994 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-j2hm8"] Feb 17 18:02:37 crc kubenswrapper[4762]: I0217 18:02:37.878057 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-j2hm8" event={"ID":"66cbf86e-4179-4923-9177-343729807287","Type":"ContainerStarted","Data":"5cfdc73322a6702aef0df902d8775e925ae1fa7b4db5c9affd047fdd6bdf0578"} Feb 17 18:02:38 crc kubenswrapper[4762]: I0217 18:02:38.889101 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-j2hm8" event={"ID":"66cbf86e-4179-4923-9177-343729807287","Type":"ContainerStarted","Data":"4852612ac2fcd3c10082e621920f6eb81de2543983b53c3331f330515d7db35d"} Feb 17 18:02:38 crc kubenswrapper[4762]: I0217 18:02:38.927256 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-j2hm8" podStartSLOduration=2.09912701 podStartE2EDuration="2.927235196s" podCreationTimestamp="2026-02-17 18:02:36 +0000 UTC" firstStartedPulling="2026-02-17 18:02:37.219064153 +0000 UTC m=+908.863982163" lastFinishedPulling="2026-02-17 18:02:38.047172349 +0000 UTC m=+909.692090349" observedRunningTime="2026-02-17 18:02:38.91913743 +0000 UTC m=+910.564055440" watchObservedRunningTime="2026-02-17 18:02:38.927235196 +0000 UTC m=+910.572153206" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.117403 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.118608 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.121715 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.121714 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.121982 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.121883 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.123543 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-mmf26" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.143181 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206034 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9a34938-3950-4fa5-a14d-30feb52b752e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206146 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206176 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vh6d\" (UniqueName: \"kubernetes.io/projected/d9a34938-3950-4fa5-a14d-30feb52b752e-kube-api-access-6vh6d\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206205 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206226 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206249 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.206363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9a34938-3950-4fa5-a14d-30feb52b752e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc 
kubenswrapper[4762]: I0217 18:02:40.206386 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9a34938-3950-4fa5-a14d-30feb52b752e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.307659 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vh6d\" (UniqueName: \"kubernetes.io/projected/d9a34938-3950-4fa5-a14d-30feb52b752e-kube-api-access-6vh6d\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.307709 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.307731 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.307753 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 
18:02:40.307811 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9a34938-3950-4fa5-a14d-30feb52b752e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.307835 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9a34938-3950-4fa5-a14d-30feb52b752e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.307863 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9a34938-3950-4fa5-a14d-30feb52b752e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.308313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.308394 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.308757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.309296 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9a34938-3950-4fa5-a14d-30feb52b752e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.310799 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.310943 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c2b9df89b6bf7cbd87a4c0209bd2881582999f86459a59458912fb3b0637b31/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.313535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9a34938-3950-4fa5-a14d-30feb52b752e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.327548 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9a34938-3950-4fa5-a14d-30feb52b752e-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.329417 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9a34938-3950-4fa5-a14d-30feb52b752e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.339801 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vh6d\" (UniqueName: \"kubernetes.io/projected/d9a34938-3950-4fa5-a14d-30feb52b752e-kube-api-access-6vh6d\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.346738 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a929313e-0845-4cee-a1c8-ab8b3a7c1099\") pod \"rabbitmq-server-0\" (UID: \"d9a34938-3950-4fa5-a14d-30feb52b752e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.437279 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.685764 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Feb 17 18:02:40 crc kubenswrapper[4762]: W0217 18:02:40.694403 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9a34938_3950_4fa5_a14d_30feb52b752e.slice/crio-dbaeb97742f6a67970eac547470e86ab1c2ce129a35d4f403adc781e9dd21a66 WatchSource:0}: Error finding container dbaeb97742f6a67970eac547470e86ab1c2ce129a35d4f403adc781e9dd21a66: Status 404 returned error can't find the container with id dbaeb97742f6a67970eac547470e86ab1c2ce129a35d4f403adc781e9dd21a66 Feb 17 18:02:40 crc kubenswrapper[4762]: I0217 18:02:40.901348 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"d9a34938-3950-4fa5-a14d-30feb52b752e","Type":"ContainerStarted","Data":"dbaeb97742f6a67970eac547470e86ab1c2ce129a35d4f403adc781e9dd21a66"} Feb 17 18:02:46 crc kubenswrapper[4762]: I0217 18:02:46.809932 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:46 crc kubenswrapper[4762]: I0217 18:02:46.810344 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:46 crc kubenswrapper[4762]: I0217 18:02:46.842391 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:46 crc kubenswrapper[4762]: I0217 18:02:46.984945 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-j2hm8" Feb 17 18:02:48 crc kubenswrapper[4762]: I0217 18:02:48.962270 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"d9a34938-3950-4fa5-a14d-30feb52b752e","Type":"ContainerStarted","Data":"eb6d0cbe159491ec6c6dabf4a014512c4285f2dd4e5a4580516765f04f0841f7"} Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.333406 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj"] Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.335832 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.339426 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ph6qt" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.345154 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj"] Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.460120 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-util\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.460197 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-bundle\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: 
I0217 18:02:50.460259 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9r8v\" (UniqueName: \"kubernetes.io/projected/88f65670-f91f-492b-bd41-c266624e0664-kube-api-access-g9r8v\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.561063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-util\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.561143 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-bundle\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.561198 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9r8v\" (UniqueName: \"kubernetes.io/projected/88f65670-f91f-492b-bd41-c266624e0664-kube-api-access-g9r8v\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.561678 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-util\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.561757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-bundle\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.585587 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9r8v\" (UniqueName: \"kubernetes.io/projected/88f65670-f91f-492b-bd41-c266624e0664-kube-api-access-g9r8v\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:50 crc kubenswrapper[4762]: I0217 18:02:50.689689 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:51 crc kubenswrapper[4762]: I0217 18:02:51.110596 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj"] Feb 17 18:02:51 crc kubenswrapper[4762]: I0217 18:02:51.985324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" event={"ID":"88f65670-f91f-492b-bd41-c266624e0664","Type":"ContainerStarted","Data":"e677be9309393ea0717bcef1e61c39409e6fc931b5e218b2053f730d1f062e72"} Feb 17 18:02:54 crc kubenswrapper[4762]: I0217 18:02:54.000866 4762 generic.go:334] "Generic (PLEG): container finished" podID="88f65670-f91f-492b-bd41-c266624e0664" containerID="042c183709a00434bad972d33f62701f56a01d7ce2f1bb41150f01159de331fb" exitCode=0 Feb 17 18:02:54 crc kubenswrapper[4762]: I0217 18:02:54.000930 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" event={"ID":"88f65670-f91f-492b-bd41-c266624e0664","Type":"ContainerDied","Data":"042c183709a00434bad972d33f62701f56a01d7ce2f1bb41150f01159de331fb"} Feb 17 18:02:55 crc kubenswrapper[4762]: I0217 18:02:55.010848 4762 generic.go:334] "Generic (PLEG): container finished" podID="88f65670-f91f-492b-bd41-c266624e0664" containerID="c2208e495dd09e759e0a13497f388206081b05f1b0bbb58340c264ef2db155a4" exitCode=0 Feb 17 18:02:55 crc kubenswrapper[4762]: I0217 18:02:55.010928 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" event={"ID":"88f65670-f91f-492b-bd41-c266624e0664","Type":"ContainerDied","Data":"c2208e495dd09e759e0a13497f388206081b05f1b0bbb58340c264ef2db155a4"} Feb 17 18:02:56 crc kubenswrapper[4762]: I0217 18:02:56.020045 4762 generic.go:334] 
"Generic (PLEG): container finished" podID="88f65670-f91f-492b-bd41-c266624e0664" containerID="ea8ed804d7b6281700d6316c3a1c4f60eb31a9207efd9b9a02e85b9d2538e768" exitCode=0 Feb 17 18:02:56 crc kubenswrapper[4762]: I0217 18:02:56.020101 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" event={"ID":"88f65670-f91f-492b-bd41-c266624e0664","Type":"ContainerDied","Data":"ea8ed804d7b6281700d6316c3a1c4f60eb31a9207efd9b9a02e85b9d2538e768"} Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.332618 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.477939 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-util\") pod \"88f65670-f91f-492b-bd41-c266624e0664\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.478279 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-bundle\") pod \"88f65670-f91f-492b-bd41-c266624e0664\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.478437 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9r8v\" (UniqueName: \"kubernetes.io/projected/88f65670-f91f-492b-bd41-c266624e0664-kube-api-access-g9r8v\") pod \"88f65670-f91f-492b-bd41-c266624e0664\" (UID: \"88f65670-f91f-492b-bd41-c266624e0664\") " Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.479114 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-bundle" (OuterVolumeSpecName: "bundle") pod "88f65670-f91f-492b-bd41-c266624e0664" (UID: "88f65670-f91f-492b-bd41-c266624e0664"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.491028 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f65670-f91f-492b-bd41-c266624e0664-kube-api-access-g9r8v" (OuterVolumeSpecName: "kube-api-access-g9r8v") pod "88f65670-f91f-492b-bd41-c266624e0664" (UID: "88f65670-f91f-492b-bd41-c266624e0664"). InnerVolumeSpecName "kube-api-access-g9r8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.494459 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-util" (OuterVolumeSpecName: "util") pod "88f65670-f91f-492b-bd41-c266624e0664" (UID: "88f65670-f91f-492b-bd41-c266624e0664"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.579671 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9r8v\" (UniqueName: \"kubernetes.io/projected/88f65670-f91f-492b-bd41-c266624e0664-kube-api-access-g9r8v\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.579725 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:57 crc kubenswrapper[4762]: I0217 18:02:57.579739 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f65670-f91f-492b-bd41-c266624e0664-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:58 crc kubenswrapper[4762]: I0217 18:02:58.033324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" event={"ID":"88f65670-f91f-492b-bd41-c266624e0664","Type":"ContainerDied","Data":"e677be9309393ea0717bcef1e61c39409e6fc931b5e218b2053f730d1f062e72"} Feb 17 18:02:58 crc kubenswrapper[4762]: I0217 18:02:58.033568 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e677be9309393ea0717bcef1e61c39409e6fc931b5e218b2053f730d1f062e72" Feb 17 18:02:58 crc kubenswrapper[4762]: I0217 18:02:58.033381 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj" Feb 17 18:03:04 crc kubenswrapper[4762]: I0217 18:03:04.558280 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:03:04 crc kubenswrapper[4762]: I0217 18:03:04.558681 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.300001 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn"] Feb 17 18:03:08 crc kubenswrapper[4762]: E0217 18:03:08.300554 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f65670-f91f-492b-bd41-c266624e0664" containerName="pull" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.300570 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f65670-f91f-492b-bd41-c266624e0664" containerName="pull" Feb 17 18:03:08 crc kubenswrapper[4762]: E0217 18:03:08.300582 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f65670-f91f-492b-bd41-c266624e0664" containerName="extract" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.300590 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f65670-f91f-492b-bd41-c266624e0664" containerName="extract" Feb 17 18:03:08 crc kubenswrapper[4762]: E0217 18:03:08.300611 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f65670-f91f-492b-bd41-c266624e0664" 
containerName="util" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.300641 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f65670-f91f-492b-bd41-c266624e0664" containerName="util" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.300798 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f65670-f91f-492b-bd41-c266624e0664" containerName="extract" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.301289 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.303555 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.304266 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-8rwvh" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.313320 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn"] Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.421231 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36598dd3-5ec9-43b7-9752-85fff598e285-apiservice-cert\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.421313 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36598dd3-5ec9-43b7-9752-85fff598e285-webhook-cert\") pod 
\"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.421368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8knz7\" (UniqueName: \"kubernetes.io/projected/36598dd3-5ec9-43b7-9752-85fff598e285-kube-api-access-8knz7\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.522260 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8knz7\" (UniqueName: \"kubernetes.io/projected/36598dd3-5ec9-43b7-9752-85fff598e285-kube-api-access-8knz7\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.522390 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36598dd3-5ec9-43b7-9752-85fff598e285-apiservice-cert\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.522437 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36598dd3-5ec9-43b7-9752-85fff598e285-webhook-cert\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " 
pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.529062 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36598dd3-5ec9-43b7-9752-85fff598e285-apiservice-cert\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.536353 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36598dd3-5ec9-43b7-9752-85fff598e285-webhook-cert\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.538129 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8knz7\" (UniqueName: \"kubernetes.io/projected/36598dd3-5ec9-43b7-9752-85fff598e285-kube-api-access-8knz7\") pod \"keystone-operator-controller-manager-74688bd7c7-pzbvn\" (UID: \"36598dd3-5ec9-43b7-9752-85fff598e285\") " pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.620903 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:08 crc kubenswrapper[4762]: I0217 18:03:08.852356 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn"] Feb 17 18:03:08 crc kubenswrapper[4762]: W0217 18:03:08.863151 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36598dd3_5ec9_43b7_9752_85fff598e285.slice/crio-856bc79ca7908be4cda8d44f4aff4db547b6fddebbeeb7555739184aaa6c8a7b WatchSource:0}: Error finding container 856bc79ca7908be4cda8d44f4aff4db547b6fddebbeeb7555739184aaa6c8a7b: Status 404 returned error can't find the container with id 856bc79ca7908be4cda8d44f4aff4db547b6fddebbeeb7555739184aaa6c8a7b Feb 17 18:03:09 crc kubenswrapper[4762]: I0217 18:03:09.104646 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" event={"ID":"36598dd3-5ec9-43b7-9752-85fff598e285","Type":"ContainerStarted","Data":"856bc79ca7908be4cda8d44f4aff4db547b6fddebbeeb7555739184aaa6c8a7b"} Feb 17 18:03:13 crc kubenswrapper[4762]: I0217 18:03:13.132355 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" event={"ID":"36598dd3-5ec9-43b7-9752-85fff598e285","Type":"ContainerStarted","Data":"5b4870062babe8d008016e90719d230eccbf2db2d26f8715c4855626745c00ab"} Feb 17 18:03:13 crc kubenswrapper[4762]: I0217 18:03:13.133016 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:13 crc kubenswrapper[4762]: I0217 18:03:13.152904 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" 
podStartSLOduration=1.240207364 podStartE2EDuration="5.152885817s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="2026-02-17 18:03:08.866746269 +0000 UTC m=+940.511664279" lastFinishedPulling="2026-02-17 18:03:12.779424722 +0000 UTC m=+944.424342732" observedRunningTime="2026-02-17 18:03:13.148878185 +0000 UTC m=+944.793796195" watchObservedRunningTime="2026-02-17 18:03:13.152885817 +0000 UTC m=+944.797803827" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.495718 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ck6sq"] Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.497373 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.512458 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ck6sq"] Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.540202 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7k9h\" (UniqueName: \"kubernetes.io/projected/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-kube-api-access-l7k9h\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.540324 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-catalog-content\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.540364 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-utilities\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.641823 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7k9h\" (UniqueName: \"kubernetes.io/projected/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-kube-api-access-l7k9h\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.641947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-catalog-content\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.641990 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-utilities\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.642451 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-catalog-content\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.642460 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-utilities\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.690726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7k9h\" (UniqueName: \"kubernetes.io/projected/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-kube-api-access-l7k9h\") pod \"certified-operators-ck6sq\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:16 crc kubenswrapper[4762]: I0217 18:03:16.817940 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:17 crc kubenswrapper[4762]: I0217 18:03:17.225293 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ck6sq"] Feb 17 18:03:18 crc kubenswrapper[4762]: I0217 18:03:18.162456 4762 generic.go:334] "Generic (PLEG): container finished" podID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerID="254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c" exitCode=0 Feb 17 18:03:18 crc kubenswrapper[4762]: I0217 18:03:18.162525 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerDied","Data":"254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c"} Feb 17 18:03:18 crc kubenswrapper[4762]: I0217 18:03:18.162552 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerStarted","Data":"3209187f64d3a4004f72b13c3152b8c3478e0586c59b7bf165cf658eb944a058"} Feb 17 18:03:18 crc kubenswrapper[4762]: I0217 18:03:18.625692 4762 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-74688bd7c7-pzbvn" Feb 17 18:03:19 crc kubenswrapper[4762]: I0217 18:03:19.170340 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerStarted","Data":"cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d"} Feb 17 18:03:20 crc kubenswrapper[4762]: I0217 18:03:20.178236 4762 generic.go:334] "Generic (PLEG): container finished" podID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerID="cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d" exitCode=0 Feb 17 18:03:20 crc kubenswrapper[4762]: I0217 18:03:20.178360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerDied","Data":"cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d"} Feb 17 18:03:20 crc kubenswrapper[4762]: I0217 18:03:20.180957 4762 generic.go:334] "Generic (PLEG): container finished" podID="d9a34938-3950-4fa5-a14d-30feb52b752e" containerID="eb6d0cbe159491ec6c6dabf4a014512c4285f2dd4e5a4580516765f04f0841f7" exitCode=0 Feb 17 18:03:20 crc kubenswrapper[4762]: I0217 18:03:20.181014 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"d9a34938-3950-4fa5-a14d-30feb52b752e","Type":"ContainerDied","Data":"eb6d0cbe159491ec6c6dabf4a014512c4285f2dd4e5a4580516765f04f0841f7"} Feb 17 18:03:21 crc kubenswrapper[4762]: I0217 18:03:21.190236 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerStarted","Data":"839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e"} Feb 17 18:03:21 crc kubenswrapper[4762]: I0217 18:03:21.192322 4762 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"d9a34938-3950-4fa5-a14d-30feb52b752e","Type":"ContainerStarted","Data":"35e912d21ce71bfac87f9c6572cb111533da0fe16f78f21aca9f0f114132bb1e"} Feb 17 18:03:21 crc kubenswrapper[4762]: I0217 18:03:21.192515 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:03:21 crc kubenswrapper[4762]: I0217 18:03:21.215728 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ck6sq" podStartSLOduration=2.781081468 podStartE2EDuration="5.21570271s" podCreationTimestamp="2026-02-17 18:03:16 +0000 UTC" firstStartedPulling="2026-02-17 18:03:18.164208318 +0000 UTC m=+949.809126328" lastFinishedPulling="2026-02-17 18:03:20.59882956 +0000 UTC m=+952.243747570" observedRunningTime="2026-02-17 18:03:21.212850161 +0000 UTC m=+952.857768181" watchObservedRunningTime="2026-02-17 18:03:21.21570271 +0000 UTC m=+952.860620720" Feb 17 18:03:21 crc kubenswrapper[4762]: I0217 18:03:21.238333 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=35.412386645 podStartE2EDuration="42.238315661s" podCreationTimestamp="2026-02-17 18:02:39 +0000 UTC" firstStartedPulling="2026-02-17 18:02:40.697945856 +0000 UTC m=+912.342863876" lastFinishedPulling="2026-02-17 18:02:47.523874882 +0000 UTC m=+919.168792892" observedRunningTime="2026-02-17 18:03:21.237802697 +0000 UTC m=+952.882720717" watchObservedRunningTime="2026-02-17 18:03:21.238315661 +0000 UTC m=+952.883233671" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.788244 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-g8r6d"] Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.789530 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.796192 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-20e3-account-create-update-7j87t"] Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.797366 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.799894 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-g8r6d"] Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.800655 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.811586 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-20e3-account-create-update-7j87t"] Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.862896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8t9z\" (UniqueName: \"kubernetes.io/projected/f5bdeb25-310e-4aaf-8998-a5b7188cb179-kube-api-access-s8t9z\") pod \"keystone-db-create-g8r6d\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.862975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ff6d53-9898-4d90-92aa-693f03bf528a-operator-scripts\") pod \"keystone-20e3-account-create-update-7j87t\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.863172 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prnz\" (UniqueName: \"kubernetes.io/projected/25ff6d53-9898-4d90-92aa-693f03bf528a-kube-api-access-7prnz\") pod \"keystone-20e3-account-create-update-7j87t\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.863327 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bdeb25-310e-4aaf-8998-a5b7188cb179-operator-scripts\") pod \"keystone-db-create-g8r6d\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.964828 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prnz\" (UniqueName: \"kubernetes.io/projected/25ff6d53-9898-4d90-92aa-693f03bf528a-kube-api-access-7prnz\") pod \"keystone-20e3-account-create-update-7j87t\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.964913 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bdeb25-310e-4aaf-8998-a5b7188cb179-operator-scripts\") pod \"keystone-db-create-g8r6d\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.964965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8t9z\" (UniqueName: \"kubernetes.io/projected/f5bdeb25-310e-4aaf-8998-a5b7188cb179-kube-api-access-s8t9z\") pod \"keystone-db-create-g8r6d\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " 
pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.965010 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ff6d53-9898-4d90-92aa-693f03bf528a-operator-scripts\") pod \"keystone-20e3-account-create-update-7j87t\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.965931 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ff6d53-9898-4d90-92aa-693f03bf528a-operator-scripts\") pod \"keystone-20e3-account-create-update-7j87t\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.965968 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bdeb25-310e-4aaf-8998-a5b7188cb179-operator-scripts\") pod \"keystone-db-create-g8r6d\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:23 crc kubenswrapper[4762]: I0217 18:03:23.990964 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prnz\" (UniqueName: \"kubernetes.io/projected/25ff6d53-9898-4d90-92aa-693f03bf528a-kube-api-access-7prnz\") pod \"keystone-20e3-account-create-update-7j87t\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:24 crc kubenswrapper[4762]: I0217 18:03:24.000911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8t9z\" (UniqueName: \"kubernetes.io/projected/f5bdeb25-310e-4aaf-8998-a5b7188cb179-kube-api-access-s8t9z\") pod 
\"keystone-db-create-g8r6d\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:24 crc kubenswrapper[4762]: I0217 18:03:24.111209 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:24 crc kubenswrapper[4762]: I0217 18:03:24.122061 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:24 crc kubenswrapper[4762]: I0217 18:03:24.549469 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-g8r6d"] Feb 17 18:03:24 crc kubenswrapper[4762]: W0217 18:03:24.553107 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5bdeb25_310e_4aaf_8998_a5b7188cb179.slice/crio-f827ad2045f97b956589bc41f9594b5fe02561a4162e33c9bafcf406ced50253 WatchSource:0}: Error finding container f827ad2045f97b956589bc41f9594b5fe02561a4162e33c9bafcf406ced50253: Status 404 returned error can't find the container with id f827ad2045f97b956589bc41f9594b5fe02561a4162e33c9bafcf406ced50253 Feb 17 18:03:24 crc kubenswrapper[4762]: I0217 18:03:24.623348 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-20e3-account-create-update-7j87t"] Feb 17 18:03:24 crc kubenswrapper[4762]: W0217 18:03:24.624855 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ff6d53_9898_4d90_92aa_693f03bf528a.slice/crio-7bc5672333c9b7223bceeeeb08a7de05608c6c60ace3816fbf5a2b61933441af WatchSource:0}: Error finding container 7bc5672333c9b7223bceeeeb08a7de05608c6c60ace3816fbf5a2b61933441af: Status 404 returned error can't find the container with id 7bc5672333c9b7223bceeeeb08a7de05608c6c60ace3816fbf5a2b61933441af Feb 17 18:03:25 crc 
kubenswrapper[4762]: I0217 18:03:25.217757 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-g8r6d" event={"ID":"f5bdeb25-310e-4aaf-8998-a5b7188cb179","Type":"ContainerStarted","Data":"7d80608f4a85df1912eda533b8a4ab3de2a71d0d1d0f9d315c6a22f95e94d866"} Feb 17 18:03:25 crc kubenswrapper[4762]: I0217 18:03:25.217809 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-g8r6d" event={"ID":"f5bdeb25-310e-4aaf-8998-a5b7188cb179","Type":"ContainerStarted","Data":"f827ad2045f97b956589bc41f9594b5fe02561a4162e33c9bafcf406ced50253"} Feb 17 18:03:25 crc kubenswrapper[4762]: I0217 18:03:25.220332 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" event={"ID":"25ff6d53-9898-4d90-92aa-693f03bf528a","Type":"ContainerStarted","Data":"525eb9d6d7c9ffab15788d1267ed86abb27f2342ac89bf655ce220be8de5bbfc"} Feb 17 18:03:25 crc kubenswrapper[4762]: I0217 18:03:25.220457 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" event={"ID":"25ff6d53-9898-4d90-92aa-693f03bf528a","Type":"ContainerStarted","Data":"7bc5672333c9b7223bceeeeb08a7de05608c6c60ace3816fbf5a2b61933441af"} Feb 17 18:03:25 crc kubenswrapper[4762]: I0217 18:03:25.232713 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-create-g8r6d" podStartSLOduration=2.232695965 podStartE2EDuration="2.232695965s" podCreationTimestamp="2026-02-17 18:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:25.231150221 +0000 UTC m=+956.876068231" watchObservedRunningTime="2026-02-17 18:03:25.232695965 +0000 UTC m=+956.877613975" Feb 17 18:03:25 crc kubenswrapper[4762]: I0217 18:03:25.259930 4762 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" podStartSLOduration=2.259910714 podStartE2EDuration="2.259910714s" podCreationTimestamp="2026-02-17 18:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:25.254351539 +0000 UTC m=+956.899269549" watchObservedRunningTime="2026-02-17 18:03:25.259910714 +0000 UTC m=+956.904828724" Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.228648 4762 generic.go:334] "Generic (PLEG): container finished" podID="f5bdeb25-310e-4aaf-8998-a5b7188cb179" containerID="7d80608f4a85df1912eda533b8a4ab3de2a71d0d1d0f9d315c6a22f95e94d866" exitCode=0 Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.228880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-g8r6d" event={"ID":"f5bdeb25-310e-4aaf-8998-a5b7188cb179","Type":"ContainerDied","Data":"7d80608f4a85df1912eda533b8a4ab3de2a71d0d1d0f9d315c6a22f95e94d866"} Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.231367 4762 generic.go:334] "Generic (PLEG): container finished" podID="25ff6d53-9898-4d90-92aa-693f03bf528a" containerID="525eb9d6d7c9ffab15788d1267ed86abb27f2342ac89bf655ce220be8de5bbfc" exitCode=0 Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.231403 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" event={"ID":"25ff6d53-9898-4d90-92aa-693f03bf528a","Type":"ContainerDied","Data":"525eb9d6d7c9ffab15788d1267ed86abb27f2342ac89bf655ce220be8de5bbfc"} Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.819019 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.819412 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:26 crc kubenswrapper[4762]: I0217 18:03:26.865678 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.293956 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.499853 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-rtqff"] Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.501175 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.507455 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-2xlps" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.507760 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-rtqff"] Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.594941 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.615236 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wj5\" (UniqueName: \"kubernetes.io/projected/0a83620f-b2f0-4ad8-b821-382533a09fc7-kube-api-access-c5wj5\") pod \"horizon-operator-index-rtqff\" (UID: \"0a83620f-b2f0-4ad8-b821-382533a09fc7\") " pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.636799 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.716323 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8t9z\" (UniqueName: \"kubernetes.io/projected/f5bdeb25-310e-4aaf-8998-a5b7188cb179-kube-api-access-s8t9z\") pod \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.716473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bdeb25-310e-4aaf-8998-a5b7188cb179-operator-scripts\") pod \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\" (UID: \"f5bdeb25-310e-4aaf-8998-a5b7188cb179\") " Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.716729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wj5\" (UniqueName: \"kubernetes.io/projected/0a83620f-b2f0-4ad8-b821-382533a09fc7-kube-api-access-c5wj5\") pod \"horizon-operator-index-rtqff\" (UID: \"0a83620f-b2f0-4ad8-b821-382533a09fc7\") " pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.717542 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bdeb25-310e-4aaf-8998-a5b7188cb179-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5bdeb25-310e-4aaf-8998-a5b7188cb179" (UID: "f5bdeb25-310e-4aaf-8998-a5b7188cb179"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.731843 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5bdeb25-310e-4aaf-8998-a5b7188cb179-kube-api-access-s8t9z" (OuterVolumeSpecName: "kube-api-access-s8t9z") pod "f5bdeb25-310e-4aaf-8998-a5b7188cb179" (UID: "f5bdeb25-310e-4aaf-8998-a5b7188cb179"). InnerVolumeSpecName "kube-api-access-s8t9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.732688 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wj5\" (UniqueName: \"kubernetes.io/projected/0a83620f-b2f0-4ad8-b821-382533a09fc7-kube-api-access-c5wj5\") pod \"horizon-operator-index-rtqff\" (UID: \"0a83620f-b2f0-4ad8-b821-382533a09fc7\") " pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.818160 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prnz\" (UniqueName: \"kubernetes.io/projected/25ff6d53-9898-4d90-92aa-693f03bf528a-kube-api-access-7prnz\") pod \"25ff6d53-9898-4d90-92aa-693f03bf528a\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.818221 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ff6d53-9898-4d90-92aa-693f03bf528a-operator-scripts\") pod \"25ff6d53-9898-4d90-92aa-693f03bf528a\" (UID: \"25ff6d53-9898-4d90-92aa-693f03bf528a\") " Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.818449 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8t9z\" (UniqueName: \"kubernetes.io/projected/f5bdeb25-310e-4aaf-8998-a5b7188cb179-kube-api-access-s8t9z\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.818461 4762 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bdeb25-310e-4aaf-8998-a5b7188cb179-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.818852 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ff6d53-9898-4d90-92aa-693f03bf528a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25ff6d53-9898-4d90-92aa-693f03bf528a" (UID: "25ff6d53-9898-4d90-92aa-693f03bf528a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.819223 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.821193 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ff6d53-9898-4d90-92aa-693f03bf528a-kube-api-access-7prnz" (OuterVolumeSpecName: "kube-api-access-7prnz") pod "25ff6d53-9898-4d90-92aa-693f03bf528a" (UID: "25ff6d53-9898-4d90-92aa-693f03bf528a"). InnerVolumeSpecName "kube-api-access-7prnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.919939 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ff6d53-9898-4d90-92aa-693f03bf528a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:27 crc kubenswrapper[4762]: I0217 18:03:27.919973 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prnz\" (UniqueName: \"kubernetes.io/projected/25ff6d53-9898-4d90-92aa-693f03bf528a-kube-api-access-7prnz\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.256103 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-rtqff"] Feb 17 18:03:28 crc kubenswrapper[4762]: W0217 18:03:28.257221 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a83620f_b2f0_4ad8_b821_382533a09fc7.slice/crio-7937f7c744e7e7fbcc37c58cf90c0cb8f8358ee725b5d955410be848980fbd56 WatchSource:0}: Error finding container 7937f7c744e7e7fbcc37c58cf90c0cb8f8358ee725b5d955410be848980fbd56: Status 404 returned error can't find the container with id 7937f7c744e7e7fbcc37c58cf90c0cb8f8358ee725b5d955410be848980fbd56 Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.269687 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-g8r6d" event={"ID":"f5bdeb25-310e-4aaf-8998-a5b7188cb179","Type":"ContainerDied","Data":"f827ad2045f97b956589bc41f9594b5fe02561a4162e33c9bafcf406ced50253"} Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.269733 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-g8r6d" Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.269762 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f827ad2045f97b956589bc41f9594b5fe02561a4162e33c9bafcf406ced50253" Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.271731 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.271736 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-20e3-account-create-update-7j87t" event={"ID":"25ff6d53-9898-4d90-92aa-693f03bf528a","Type":"ContainerDied","Data":"7bc5672333c9b7223bceeeeb08a7de05608c6c60ace3816fbf5a2b61933441af"} Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.271778 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bc5672333c9b7223bceeeeb08a7de05608c6c60ace3816fbf5a2b61933441af" Feb 17 18:03:28 crc kubenswrapper[4762]: I0217 18:03:28.687705 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ck6sq"] Feb 17 18:03:29 crc kubenswrapper[4762]: I0217 18:03:29.281522 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-rtqff" event={"ID":"0a83620f-b2f0-4ad8-b821-382533a09fc7","Type":"ContainerStarted","Data":"7269c782266d323a02ae294bf6c2e92ed1167a21b9d6597b56fde298bfa7b5f1"} Feb 17 18:03:29 crc kubenswrapper[4762]: I0217 18:03:29.281850 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-rtqff" event={"ID":"0a83620f-b2f0-4ad8-b821-382533a09fc7","Type":"ContainerStarted","Data":"7937f7c744e7e7fbcc37c58cf90c0cb8f8358ee725b5d955410be848980fbd56"} Feb 17 18:03:29 crc kubenswrapper[4762]: I0217 18:03:29.295039 4762 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/horizon-operator-index-rtqff" podStartSLOduration=1.425354507 podStartE2EDuration="2.295018794s" podCreationTimestamp="2026-02-17 18:03:27 +0000 UTC" firstStartedPulling="2026-02-17 18:03:28.263869299 +0000 UTC m=+959.908787329" lastFinishedPulling="2026-02-17 18:03:29.133533616 +0000 UTC m=+960.778451616" observedRunningTime="2026-02-17 18:03:29.293561293 +0000 UTC m=+960.938479303" watchObservedRunningTime="2026-02-17 18:03:29.295018794 +0000 UTC m=+960.939936804" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.286506 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ck6sq" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="registry-server" containerID="cri-o://839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e" gracePeriod=2 Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.446581 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.727598 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.858741 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7k9h\" (UniqueName: \"kubernetes.io/projected/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-kube-api-access-l7k9h\") pod \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.858822 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-utilities\") pod \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.858910 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-catalog-content\") pod \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\" (UID: \"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef\") " Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.859982 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-utilities" (OuterVolumeSpecName: "utilities") pod "e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" (UID: "e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.866515 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-kube-api-access-l7k9h" (OuterVolumeSpecName: "kube-api-access-l7k9h") pod "e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" (UID: "e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef"). InnerVolumeSpecName "kube-api-access-l7k9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.919528 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" (UID: "e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.959970 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7k9h\" (UniqueName: \"kubernetes.io/projected/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-kube-api-access-l7k9h\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.960002 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.960011 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.992655 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-6zgfp"] Feb 17 18:03:30 crc kubenswrapper[4762]: E0217 18:03:30.993183 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="extract-utilities" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.993275 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="extract-utilities" Feb 17 18:03:30 crc kubenswrapper[4762]: E0217 18:03:30.993350 4762 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="25ff6d53-9898-4d90-92aa-693f03bf528a" containerName="mariadb-account-create-update" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.993411 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ff6d53-9898-4d90-92aa-693f03bf528a" containerName="mariadb-account-create-update" Feb 17 18:03:30 crc kubenswrapper[4762]: E0217 18:03:30.993473 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="extract-content" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.993560 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="extract-content" Feb 17 18:03:30 crc kubenswrapper[4762]: E0217 18:03:30.993634 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="registry-server" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.993704 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="registry-server" Feb 17 18:03:30 crc kubenswrapper[4762]: E0217 18:03:30.993775 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5bdeb25-310e-4aaf-8998-a5b7188cb179" containerName="mariadb-database-create" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.993848 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5bdeb25-310e-4aaf-8998-a5b7188cb179" containerName="mariadb-database-create" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.994056 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ff6d53-9898-4d90-92aa-693f03bf528a" containerName="mariadb-account-create-update" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.994147 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5bdeb25-310e-4aaf-8998-a5b7188cb179" containerName="mariadb-database-create" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 
18:03:30.994222 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerName="registry-server" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.994839 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:30 crc kubenswrapper[4762]: I0217 18:03:30.997557 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.010486 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.010703 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.011024 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gftmg" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.022309 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-6zgfp"] Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.162295 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7p2\" (UniqueName: \"kubernetes.io/projected/d1af73f3-931c-4417-ab51-c2888ae6a593-kube-api-access-7x7p2\") pod \"keystone-db-sync-6zgfp\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.162389 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1af73f3-931c-4417-ab51-c2888ae6a593-config-data\") pod \"keystone-db-sync-6zgfp\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " 
pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.263866 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x7p2\" (UniqueName: \"kubernetes.io/projected/d1af73f3-931c-4417-ab51-c2888ae6a593-kube-api-access-7x7p2\") pod \"keystone-db-sync-6zgfp\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.263948 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1af73f3-931c-4417-ab51-c2888ae6a593-config-data\") pod \"keystone-db-sync-6zgfp\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.267819 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1af73f3-931c-4417-ab51-c2888ae6a593-config-data\") pod \"keystone-db-sync-6zgfp\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.282953 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x7p2\" (UniqueName: \"kubernetes.io/projected/d1af73f3-931c-4417-ab51-c2888ae6a593-kube-api-access-7x7p2\") pod \"keystone-db-sync-6zgfp\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.295357 4762 generic.go:334] "Generic (PLEG): container finished" podID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" containerID="839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e" exitCode=0 Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.295411 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerDied","Data":"839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e"} Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.295456 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck6sq" event={"ID":"e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef","Type":"ContainerDied","Data":"3209187f64d3a4004f72b13c3152b8c3478e0586c59b7bf165cf658eb944a058"} Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.295460 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ck6sq" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.295481 4762 scope.go:117] "RemoveContainer" containerID="839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.312856 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.318132 4762 scope.go:117] "RemoveContainer" containerID="cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.318518 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ck6sq"] Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.330685 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ck6sq"] Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.348420 4762 scope.go:117] "RemoveContainer" containerID="254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.364396 4762 scope.go:117] "RemoveContainer" containerID="839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e" Feb 17 18:03:31 crc kubenswrapper[4762]: E0217 18:03:31.366804 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e\": container with ID starting with 839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e not found: ID does not exist" containerID="839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.366871 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e"} err="failed to get container status \"839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e\": rpc error: code = NotFound desc = could not find container \"839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e\": container with ID starting with 839493b9e1846d48c797336ee01c534c41c8589e95b45d2354a750a5f13da52e not found: ID 
does not exist" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.366903 4762 scope.go:117] "RemoveContainer" containerID="cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d" Feb 17 18:03:31 crc kubenswrapper[4762]: E0217 18:03:31.367207 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d\": container with ID starting with cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d not found: ID does not exist" containerID="cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.367235 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d"} err="failed to get container status \"cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d\": rpc error: code = NotFound desc = could not find container \"cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d\": container with ID starting with cf230fe531f6bd6f81731d8fc50c3f0cce960731383a62432fa57be83e3d430d not found: ID does not exist" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.367254 4762 scope.go:117] "RemoveContainer" containerID="254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c" Feb 17 18:03:31 crc kubenswrapper[4762]: E0217 18:03:31.367689 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c\": container with ID starting with 254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c not found: ID does not exist" containerID="254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.368131 4762 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c"} err="failed to get container status \"254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c\": rpc error: code = NotFound desc = could not find container \"254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c\": container with ID starting with 254f51e6ca9a166f99ff06baa85fb51a303b0a29a5e13e7dbe3768af75c1860c not found: ID does not exist" Feb 17 18:03:31 crc kubenswrapper[4762]: I0217 18:03:31.750525 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-6zgfp"] Feb 17 18:03:31 crc kubenswrapper[4762]: W0217 18:03:31.757968 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1af73f3_931c_4417_ab51_c2888ae6a593.slice/crio-20ea7ad8bd0b202b42cdce8a40a4175050e707e0fb7264ab802381fd6eb39b43 WatchSource:0}: Error finding container 20ea7ad8bd0b202b42cdce8a40a4175050e707e0fb7264ab802381fd6eb39b43: Status 404 returned error can't find the container with id 20ea7ad8bd0b202b42cdce8a40a4175050e707e0fb7264ab802381fd6eb39b43 Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.293392 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-g9bhw"] Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.295103 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.298474 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-przsc" Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.302647 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-g9bhw"] Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.307029 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" event={"ID":"d1af73f3-931c-4417-ab51-c2888ae6a593","Type":"ContainerStarted","Data":"20ea7ad8bd0b202b42cdce8a40a4175050e707e0fb7264ab802381fd6eb39b43"} Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.377275 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch5zj\" (UniqueName: \"kubernetes.io/projected/46ed6271-2100-4c3b-a832-062d50f2311d-kube-api-access-ch5zj\") pod \"swift-operator-index-g9bhw\" (UID: \"46ed6271-2100-4c3b-a832-062d50f2311d\") " pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.479307 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch5zj\" (UniqueName: \"kubernetes.io/projected/46ed6271-2100-4c3b-a832-062d50f2311d-kube-api-access-ch5zj\") pod \"swift-operator-index-g9bhw\" (UID: \"46ed6271-2100-4c3b-a832-062d50f2311d\") " pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.500170 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch5zj\" (UniqueName: \"kubernetes.io/projected/46ed6271-2100-4c3b-a832-062d50f2311d-kube-api-access-ch5zj\") pod \"swift-operator-index-g9bhw\" (UID: \"46ed6271-2100-4c3b-a832-062d50f2311d\") " pod="openstack-operators/swift-operator-index-g9bhw" 
Feb 17 18:03:32 crc kubenswrapper[4762]: I0217 18:03:32.620800 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:33 crc kubenswrapper[4762]: I0217 18:03:33.044807 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef" path="/var/lib/kubelet/pods/e5c045fe-4c42-4a1d-8ffe-b1d6fc4a60ef/volumes" Feb 17 18:03:33 crc kubenswrapper[4762]: I0217 18:03:33.063376 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-g9bhw"] Feb 17 18:03:33 crc kubenswrapper[4762]: W0217 18:03:33.071175 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ed6271_2100_4c3b_a832_062d50f2311d.slice/crio-3214584adecf7ed1f7da95ab7e50b1e562542e0ccdd12f9bd7416437e150b8ae WatchSource:0}: Error finding container 3214584adecf7ed1f7da95ab7e50b1e562542e0ccdd12f9bd7416437e150b8ae: Status 404 returned error can't find the container with id 3214584adecf7ed1f7da95ab7e50b1e562542e0ccdd12f9bd7416437e150b8ae Feb 17 18:03:33 crc kubenswrapper[4762]: I0217 18:03:33.316339 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-g9bhw" event={"ID":"46ed6271-2100-4c3b-a832-062d50f2311d","Type":"ContainerStarted","Data":"3214584adecf7ed1f7da95ab7e50b1e562542e0ccdd12f9bd7416437e150b8ae"} Feb 17 18:03:34 crc kubenswrapper[4762]: I0217 18:03:34.558744 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:03:34 crc kubenswrapper[4762]: I0217 18:03:34.558853 4762 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:03:35 crc kubenswrapper[4762]: I0217 18:03:35.332678 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-g9bhw" event={"ID":"46ed6271-2100-4c3b-a832-062d50f2311d","Type":"ContainerStarted","Data":"c6f94754acc1da7f696f90a41ca7de0fc5750964b457cf6821d2aeaa80456e11"} Feb 17 18:03:35 crc kubenswrapper[4762]: I0217 18:03:35.356590 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-g9bhw" podStartSLOduration=1.970022446 podStartE2EDuration="3.356572473s" podCreationTimestamp="2026-02-17 18:03:32 +0000 UTC" firstStartedPulling="2026-02-17 18:03:33.07313194 +0000 UTC m=+964.718049940" lastFinishedPulling="2026-02-17 18:03:34.459681957 +0000 UTC m=+966.104599967" observedRunningTime="2026-02-17 18:03:35.351775489 +0000 UTC m=+966.996693499" watchObservedRunningTime="2026-02-17 18:03:35.356572473 +0000 UTC m=+967.001490483" Feb 17 18:03:37 crc kubenswrapper[4762]: I0217 18:03:37.820024 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:37 crc kubenswrapper[4762]: I0217 18:03:37.820715 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:37 crc kubenswrapper[4762]: I0217 18:03:37.845004 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 18:03:38 crc kubenswrapper[4762]: I0217 18:03:38.394714 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-rtqff" Feb 17 
18:03:40 crc kubenswrapper[4762]: I0217 18:03:40.377025 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" event={"ID":"d1af73f3-931c-4417-ab51-c2888ae6a593","Type":"ContainerStarted","Data":"7b6b45ba53310f8445d55cfc438f70edcc9eb4466c5151f9ce1cec45780b1ee2"} Feb 17 18:03:40 crc kubenswrapper[4762]: I0217 18:03:40.392850 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" podStartSLOduration=2.371995517 podStartE2EDuration="10.39282681s" podCreationTimestamp="2026-02-17 18:03:30 +0000 UTC" firstStartedPulling="2026-02-17 18:03:31.760651332 +0000 UTC m=+963.405569342" lastFinishedPulling="2026-02-17 18:03:39.781482625 +0000 UTC m=+971.426400635" observedRunningTime="2026-02-17 18:03:40.390589318 +0000 UTC m=+972.035507338" watchObservedRunningTime="2026-02-17 18:03:40.39282681 +0000 UTC m=+972.037744820" Feb 17 18:03:42 crc kubenswrapper[4762]: I0217 18:03:42.621304 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:42 crc kubenswrapper[4762]: I0217 18:03:42.621692 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:42 crc kubenswrapper[4762]: I0217 18:03:42.647604 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:43 crc kubenswrapper[4762]: I0217 18:03:43.394429 4762 generic.go:334] "Generic (PLEG): container finished" podID="d1af73f3-931c-4417-ab51-c2888ae6a593" containerID="7b6b45ba53310f8445d55cfc438f70edcc9eb4466c5151f9ce1cec45780b1ee2" exitCode=0 Feb 17 18:03:43 crc kubenswrapper[4762]: I0217 18:03:43.394614 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" 
event={"ID":"d1af73f3-931c-4417-ab51-c2888ae6a593","Type":"ContainerDied","Data":"7b6b45ba53310f8445d55cfc438f70edcc9eb4466c5151f9ce1cec45780b1ee2"} Feb 17 18:03:43 crc kubenswrapper[4762]: I0217 18:03:43.420786 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-g9bhw" Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.770371 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.852814 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x7p2\" (UniqueName: \"kubernetes.io/projected/d1af73f3-931c-4417-ab51-c2888ae6a593-kube-api-access-7x7p2\") pod \"d1af73f3-931c-4417-ab51-c2888ae6a593\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.852973 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1af73f3-931c-4417-ab51-c2888ae6a593-config-data\") pod \"d1af73f3-931c-4417-ab51-c2888ae6a593\" (UID: \"d1af73f3-931c-4417-ab51-c2888ae6a593\") " Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.859285 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1af73f3-931c-4417-ab51-c2888ae6a593-kube-api-access-7x7p2" (OuterVolumeSpecName: "kube-api-access-7x7p2") pod "d1af73f3-931c-4417-ab51-c2888ae6a593" (UID: "d1af73f3-931c-4417-ab51-c2888ae6a593"). InnerVolumeSpecName "kube-api-access-7x7p2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.892049 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1af73f3-931c-4417-ab51-c2888ae6a593-config-data" (OuterVolumeSpecName: "config-data") pod "d1af73f3-931c-4417-ab51-c2888ae6a593" (UID: "d1af73f3-931c-4417-ab51-c2888ae6a593"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.954875 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x7p2\" (UniqueName: \"kubernetes.io/projected/d1af73f3-931c-4417-ab51-c2888ae6a593-kube-api-access-7x7p2\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:44 crc kubenswrapper[4762]: I0217 18:03:44.954915 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1af73f3-931c-4417-ab51-c2888ae6a593-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.407519 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" event={"ID":"d1af73f3-931c-4417-ab51-c2888ae6a593","Type":"ContainerDied","Data":"20ea7ad8bd0b202b42cdce8a40a4175050e707e0fb7264ab802381fd6eb39b43"} Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.407564 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ea7ad8bd0b202b42cdce8a40a4175050e707e0fb7264ab802381fd6eb39b43" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.407601 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-6zgfp" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.613128 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-nz8q8"] Feb 17 18:03:45 crc kubenswrapper[4762]: E0217 18:03:45.615129 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1af73f3-931c-4417-ab51-c2888ae6a593" containerName="keystone-db-sync" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.615172 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1af73f3-931c-4417-ab51-c2888ae6a593" containerName="keystone-db-sync" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.615720 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1af73f3-931c-4417-ab51-c2888ae6a593" containerName="keystone-db-sync" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.619812 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.622777 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gftmg" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.622891 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.622777 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.623578 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.623721 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.633709 4762 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["glance-kuttl-tests/keystone-bootstrap-nz8q8"] Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.765158 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-credential-keys\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.765278 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mft8\" (UniqueName: \"kubernetes.io/projected/ec8a297e-a079-4043-93d2-7a5e2574003c-kube-api-access-7mft8\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.765304 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-config-data\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.765327 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-scripts\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.765344 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-fernet-keys\") pod \"keystone-bootstrap-nz8q8\" (UID: 
\"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.867142 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-credential-keys\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.867234 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mft8\" (UniqueName: \"kubernetes.io/projected/ec8a297e-a079-4043-93d2-7a5e2574003c-kube-api-access-7mft8\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.867269 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-config-data\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.867295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-scripts\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.867310 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-fernet-keys\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " 
pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.870854 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-scripts\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.871577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-fernet-keys\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.871657 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-config-data\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.872864 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-credential-keys\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 18:03:45.884928 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mft8\" (UniqueName: \"kubernetes.io/projected/ec8a297e-a079-4043-93d2-7a5e2574003c-kube-api-access-7mft8\") pod \"keystone-bootstrap-nz8q8\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:45 crc kubenswrapper[4762]: I0217 
18:03:45.942205 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.341700 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-nz8q8"] Feb 17 18:03:46 crc kubenswrapper[4762]: W0217 18:03:46.343836 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec8a297e_a079_4043_93d2_7a5e2574003c.slice/crio-f609fec5ea4c3903c740561dca541d7148ffa03e0c66d2b48467a86e36d867e5 WatchSource:0}: Error finding container f609fec5ea4c3903c740561dca541d7148ffa03e0c66d2b48467a86e36d867e5: Status 404 returned error can't find the container with id f609fec5ea4c3903c740561dca541d7148ffa03e0c66d2b48467a86e36d867e5 Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.416445 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" event={"ID":"ec8a297e-a079-4043-93d2-7a5e2574003c","Type":"ContainerStarted","Data":"f609fec5ea4c3903c740561dca541d7148ffa03e0c66d2b48467a86e36d867e5"} Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.530470 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm"] Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.531760 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.536053 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ph6qt" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.541348 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm"] Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.680240 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5zzt\" (UniqueName: \"kubernetes.io/projected/7999604c-7cbf-4bd9-9280-fb8d4d047737-kube-api-access-h5zzt\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.680332 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-bundle\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.680426 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-util\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 
18:03:46.791359 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-util\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.791422 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5zzt\" (UniqueName: \"kubernetes.io/projected/7999604c-7cbf-4bd9-9280-fb8d4d047737-kube-api-access-h5zzt\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.791465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-bundle\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.791932 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-bundle\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.791992 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-util\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.809455 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5zzt\" (UniqueName: \"kubernetes.io/projected/7999604c-7cbf-4bd9-9280-fb8d4d047737-kube-api-access-h5zzt\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:46 crc kubenswrapper[4762]: I0217 18:03:46.848448 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.251630 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm"] Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.426004 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerStarted","Data":"aa79242ae2ab117c5ec9793715b102f17cc8a8ba4976c5e73822335632b075d7"} Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.426059 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerStarted","Data":"b0e2c2db977ba00a39693c87dc969cd62e3342fbd38e13dc64783f062e33a894"} Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.428252 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" event={"ID":"ec8a297e-a079-4043-93d2-7a5e2574003c","Type":"ContainerStarted","Data":"3aae3e7920896ef3ead49c930801f94439b89f8f6a52d6233e451463c5e0431a"} Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.531558 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" podStartSLOduration=2.531538227 podStartE2EDuration="2.531538227s" podCreationTimestamp="2026-02-17 18:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:47.462436198 +0000 UTC m=+979.107354218" watchObservedRunningTime="2026-02-17 18:03:47.531538227 +0000 UTC m=+979.176456237" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.535594 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs"] Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.536849 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.549869 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs"] Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.705236 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-util\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.705296 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrn5\" (UniqueName: \"kubernetes.io/projected/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-kube-api-access-vfrn5\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.705332 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-bundle\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.807394 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-util\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.807466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrn5\" (UniqueName: \"kubernetes.io/projected/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-kube-api-access-vfrn5\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.807517 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-bundle\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.807835 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-util\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.807925 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-bundle\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: 
\"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.834746 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrn5\" (UniqueName: \"kubernetes.io/projected/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-kube-api-access-vfrn5\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:47 crc kubenswrapper[4762]: I0217 18:03:47.854924 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:48 crc kubenswrapper[4762]: W0217 18:03:48.263874 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f17fa55_8aa4_4ae0_9d3b_e1d3f638a6d5.slice/crio-9c41bb839e53bf55ea8b310de8ff5a23f4ce4b925d59e42f1090df7c8fc057e3 WatchSource:0}: Error finding container 9c41bb839e53bf55ea8b310de8ff5a23f4ce4b925d59e42f1090df7c8fc057e3: Status 404 returned error can't find the container with id 9c41bb839e53bf55ea8b310de8ff5a23f4ce4b925d59e42f1090df7c8fc057e3 Feb 17 18:03:48 crc kubenswrapper[4762]: I0217 18:03:48.267623 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs"] Feb 17 18:03:48 crc kubenswrapper[4762]: I0217 18:03:48.437674 4762 generic.go:334] "Generic (PLEG): container finished" podID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerID="aa79242ae2ab117c5ec9793715b102f17cc8a8ba4976c5e73822335632b075d7" exitCode=0 Feb 17 18:03:48 crc kubenswrapper[4762]: I0217 18:03:48.437743 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerDied","Data":"aa79242ae2ab117c5ec9793715b102f17cc8a8ba4976c5e73822335632b075d7"} Feb 17 18:03:48 crc kubenswrapper[4762]: I0217 18:03:48.441412 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" event={"ID":"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5","Type":"ContainerStarted","Data":"0f1029b9404865db4ad97b983d3bc87714868feaf20329b2366b22cb9c00325f"} Feb 17 18:03:48 crc kubenswrapper[4762]: I0217 18:03:48.441438 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" event={"ID":"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5","Type":"ContainerStarted","Data":"9c41bb839e53bf55ea8b310de8ff5a23f4ce4b925d59e42f1090df7c8fc057e3"} Feb 17 18:03:49 crc kubenswrapper[4762]: I0217 18:03:49.447931 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerStarted","Data":"271b4fc7cf781e43440d5cc903853b35dbae503a3a31f0eec673d6da88c9aefa"} Feb 17 18:03:49 crc kubenswrapper[4762]: I0217 18:03:49.451914 4762 generic.go:334] "Generic (PLEG): container finished" podID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerID="0f1029b9404865db4ad97b983d3bc87714868feaf20329b2366b22cb9c00325f" exitCode=0 Feb 17 18:03:49 crc kubenswrapper[4762]: I0217 18:03:49.451977 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" event={"ID":"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5","Type":"ContainerDied","Data":"0f1029b9404865db4ad97b983d3bc87714868feaf20329b2366b22cb9c00325f"} Feb 17 18:03:49 crc kubenswrapper[4762]: I0217 18:03:49.453120 4762 
generic.go:334] "Generic (PLEG): container finished" podID="ec8a297e-a079-4043-93d2-7a5e2574003c" containerID="3aae3e7920896ef3ead49c930801f94439b89f8f6a52d6233e451463c5e0431a" exitCode=0 Feb 17 18:03:49 crc kubenswrapper[4762]: I0217 18:03:49.453161 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" event={"ID":"ec8a297e-a079-4043-93d2-7a5e2574003c","Type":"ContainerDied","Data":"3aae3e7920896ef3ead49c930801f94439b89f8f6a52d6233e451463c5e0431a"} Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.462320 4762 generic.go:334] "Generic (PLEG): container finished" podID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerID="89fcc68d1d776d9c8913fdbe4993a07992b1024f8445865b86579fcfddfd4fb9" exitCode=0 Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.462412 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" event={"ID":"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5","Type":"ContainerDied","Data":"89fcc68d1d776d9c8913fdbe4993a07992b1024f8445865b86579fcfddfd4fb9"} Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.466109 4762 generic.go:334] "Generic (PLEG): container finished" podID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerID="271b4fc7cf781e43440d5cc903853b35dbae503a3a31f0eec673d6da88c9aefa" exitCode=0 Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.466129 4762 generic.go:334] "Generic (PLEG): container finished" podID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerID="1f5aca89e19d3f66915958e341cd745d108dd4f59115122b87d84fd208827b05" exitCode=0 Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.466169 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerDied","Data":"271b4fc7cf781e43440d5cc903853b35dbae503a3a31f0eec673d6da88c9aefa"} Feb 17 
18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.466250 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerDied","Data":"1f5aca89e19d3f66915958e341cd745d108dd4f59115122b87d84fd208827b05"} Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.822284 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.949926 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-fernet-keys\") pod \"ec8a297e-a079-4043-93d2-7a5e2574003c\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.950025 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-config-data\") pod \"ec8a297e-a079-4043-93d2-7a5e2574003c\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.950067 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-scripts\") pod \"ec8a297e-a079-4043-93d2-7a5e2574003c\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.950098 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-credential-keys\") pod \"ec8a297e-a079-4043-93d2-7a5e2574003c\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.950124 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mft8\" (UniqueName: \"kubernetes.io/projected/ec8a297e-a079-4043-93d2-7a5e2574003c-kube-api-access-7mft8\") pod \"ec8a297e-a079-4043-93d2-7a5e2574003c\" (UID: \"ec8a297e-a079-4043-93d2-7a5e2574003c\") " Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.956097 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8a297e-a079-4043-93d2-7a5e2574003c-kube-api-access-7mft8" (OuterVolumeSpecName: "kube-api-access-7mft8") pod "ec8a297e-a079-4043-93d2-7a5e2574003c" (UID: "ec8a297e-a079-4043-93d2-7a5e2574003c"). InnerVolumeSpecName "kube-api-access-7mft8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.956193 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-scripts" (OuterVolumeSpecName: "scripts") pod "ec8a297e-a079-4043-93d2-7a5e2574003c" (UID: "ec8a297e-a079-4043-93d2-7a5e2574003c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.959465 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ec8a297e-a079-4043-93d2-7a5e2574003c" (UID: "ec8a297e-a079-4043-93d2-7a5e2574003c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.959790 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ec8a297e-a079-4043-93d2-7a5e2574003c" (UID: "ec8a297e-a079-4043-93d2-7a5e2574003c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:50 crc kubenswrapper[4762]: I0217 18:03:50.970876 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-config-data" (OuterVolumeSpecName: "config-data") pod "ec8a297e-a079-4043-93d2-7a5e2574003c" (UID: "ec8a297e-a079-4043-93d2-7a5e2574003c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.051320 4762 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.051360 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.051375 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.051385 4762 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec8a297e-a079-4043-93d2-7a5e2574003c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.051397 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mft8\" (UniqueName: \"kubernetes.io/projected/ec8a297e-a079-4043-93d2-7a5e2574003c-kube-api-access-7mft8\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.476309 4762 generic.go:334] "Generic (PLEG): container finished" podID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" 
containerID="c4664f87b551d268c49a05967ace101bf4a5fa0825bdc05a19090e179278a7f7" exitCode=0 Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.477104 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" event={"ID":"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5","Type":"ContainerDied","Data":"c4664f87b551d268c49a05967ace101bf4a5fa0825bdc05a19090e179278a7f7"} Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.478724 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.479823 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-nz8q8" event={"ID":"ec8a297e-a079-4043-93d2-7a5e2574003c","Type":"ContainerDied","Data":"f609fec5ea4c3903c740561dca541d7148ffa03e0c66d2b48467a86e36d867e5"} Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.479879 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f609fec5ea4c3903c740561dca541d7148ffa03e0c66d2b48467a86e36d867e5" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.560220 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-5948fd7fc9-pkz2m"] Feb 17 18:03:51 crc kubenswrapper[4762]: E0217 18:03:51.560504 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8a297e-a079-4043-93d2-7a5e2574003c" containerName="keystone-bootstrap" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.560519 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8a297e-a079-4043-93d2-7a5e2574003c" containerName="keystone-bootstrap" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.560676 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8a297e-a079-4043-93d2-7a5e2574003c" containerName="keystone-bootstrap" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 
18:03:51.561102 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.565074 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.565211 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.565270 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gftmg" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.565428 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.574424 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-5948fd7fc9-pkz2m"] Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.658689 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpvh8\" (UniqueName: \"kubernetes.io/projected/696388b8-20ed-48cc-98fa-117526c518da-kube-api-access-mpvh8\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.658743 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-fernet-keys\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.658779 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-credential-keys\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.658818 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-scripts\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.658841 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-config-data\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.752962 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.759923 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-credential-keys\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.760018 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-scripts\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.760053 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-config-data\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.760133 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpvh8\" (UniqueName: \"kubernetes.io/projected/696388b8-20ed-48cc-98fa-117526c518da-kube-api-access-mpvh8\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.760165 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-fernet-keys\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") 
" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.766050 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-fernet-keys\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.767378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-config-data\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.767939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-credential-keys\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.769850 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696388b8-20ed-48cc-98fa-117526c518da-scripts\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.782148 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpvh8\" (UniqueName: \"kubernetes.io/projected/696388b8-20ed-48cc-98fa-117526c518da-kube-api-access-mpvh8\") pod \"keystone-5948fd7fc9-pkz2m\" (UID: \"696388b8-20ed-48cc-98fa-117526c518da\") " pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 
18:03:51.861077 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-bundle\") pod \"7999604c-7cbf-4bd9-9280-fb8d4d047737\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.861121 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5zzt\" (UniqueName: \"kubernetes.io/projected/7999604c-7cbf-4bd9-9280-fb8d4d047737-kube-api-access-h5zzt\") pod \"7999604c-7cbf-4bd9-9280-fb8d4d047737\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.861158 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-util\") pod \"7999604c-7cbf-4bd9-9280-fb8d4d047737\" (UID: \"7999604c-7cbf-4bd9-9280-fb8d4d047737\") " Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.861860 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-bundle" (OuterVolumeSpecName: "bundle") pod "7999604c-7cbf-4bd9-9280-fb8d4d047737" (UID: "7999604c-7cbf-4bd9-9280-fb8d4d047737"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.865086 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7999604c-7cbf-4bd9-9280-fb8d4d047737-kube-api-access-h5zzt" (OuterVolumeSpecName: "kube-api-access-h5zzt") pod "7999604c-7cbf-4bd9-9280-fb8d4d047737" (UID: "7999604c-7cbf-4bd9-9280-fb8d4d047737"). InnerVolumeSpecName "kube-api-access-h5zzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.877964 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-util" (OuterVolumeSpecName: "util") pod "7999604c-7cbf-4bd9-9280-fb8d4d047737" (UID: "7999604c-7cbf-4bd9-9280-fb8d4d047737"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.896789 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.962951 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.962983 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5zzt\" (UniqueName: \"kubernetes.io/projected/7999604c-7cbf-4bd9-9280-fb8d4d047737-kube-api-access-h5zzt\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:51 crc kubenswrapper[4762]: I0217 18:03:51.962999 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7999604c-7cbf-4bd9-9280-fb8d4d047737-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.285860 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-5948fd7fc9-pkz2m"] Feb 17 18:03:52 crc kubenswrapper[4762]: W0217 18:03:52.291691 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696388b8_20ed_48cc_98fa_117526c518da.slice/crio-807d9f0cf77d0f741272750db4bb3f8f36c82480d7fec6041cb0a7b691b92830 WatchSource:0}: Error finding container 
807d9f0cf77d0f741272750db4bb3f8f36c82480d7fec6041cb0a7b691b92830: Status 404 returned error can't find the container with id 807d9f0cf77d0f741272750db4bb3f8f36c82480d7fec6041cb0a7b691b92830 Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.486531 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" event={"ID":"696388b8-20ed-48cc-98fa-117526c518da","Type":"ContainerStarted","Data":"ed5252783b7dd65a612dd1fd5e6cc9b3dda923188b9eb937c70ed2bec113199c"} Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.486584 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" event={"ID":"696388b8-20ed-48cc-98fa-117526c518da","Type":"ContainerStarted","Data":"807d9f0cf77d0f741272750db4bb3f8f36c82480d7fec6041cb0a7b691b92830"} Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.486763 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.490033 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.490009 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm" event={"ID":"7999604c-7cbf-4bd9-9280-fb8d4d047737","Type":"ContainerDied","Data":"b0e2c2db977ba00a39693c87dc969cd62e3342fbd38e13dc64783f062e33a894"} Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.490077 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e2c2db977ba00a39693c87dc969cd62e3342fbd38e13dc64783f062e33a894" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.509509 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" podStartSLOduration=1.509459186 podStartE2EDuration="1.509459186s" podCreationTimestamp="2026-02-17 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:52.501779902 +0000 UTC m=+984.146697922" watchObservedRunningTime="2026-02-17 18:03:52.509459186 +0000 UTC m=+984.154377216" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.687249 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.771486 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-util\") pod \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.771607 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-bundle\") pod \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.771711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfrn5\" (UniqueName: \"kubernetes.io/projected/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-kube-api-access-vfrn5\") pod \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\" (UID: \"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5\") " Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.772708 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-bundle" (OuterVolumeSpecName: "bundle") pod "0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" (UID: "0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.776800 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-kube-api-access-vfrn5" (OuterVolumeSpecName: "kube-api-access-vfrn5") pod "0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" (UID: "0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5"). InnerVolumeSpecName "kube-api-access-vfrn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.788473 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-util" (OuterVolumeSpecName: "util") pod "0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" (UID: "0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.873582 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.873616 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:52 crc kubenswrapper[4762]: I0217 18:03:52.873649 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfrn5\" (UniqueName: \"kubernetes.io/projected/0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5-kube-api-access-vfrn5\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:53 crc kubenswrapper[4762]: I0217 18:03:53.498538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" event={"ID":"0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5","Type":"ContainerDied","Data":"9c41bb839e53bf55ea8b310de8ff5a23f4ce4b925d59e42f1090df7c8fc057e3"} Feb 17 18:03:53 crc kubenswrapper[4762]: I0217 18:03:53.498612 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c41bb839e53bf55ea8b310de8ff5a23f4ce4b925d59e42f1090df7c8fc057e3" Feb 17 18:03:53 crc kubenswrapper[4762]: I0217 18:03:53.498582 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs" Feb 17 18:04:04 crc kubenswrapper[4762]: I0217 18:04:04.558383 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:04:04 crc kubenswrapper[4762]: I0217 18:04:04.559038 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:04:04 crc kubenswrapper[4762]: I0217 18:04:04.559103 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 18:04:04 crc kubenswrapper[4762]: I0217 18:04:04.559868 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6d7169d5319fd48ce328413c1944d85701526c1b8e50744c099c2e1b3abb5de"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:04:04 crc kubenswrapper[4762]: I0217 18:04:04.559937 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" containerID="cri-o://f6d7169d5319fd48ce328413c1944d85701526c1b8e50744c099c2e1b3abb5de" gracePeriod=600 Feb 17 18:04:05 crc kubenswrapper[4762]: I0217 18:04:05.582314 4762 generic.go:334] "Generic (PLEG): 
container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="f6d7169d5319fd48ce328413c1944d85701526c1b8e50744c099c2e1b3abb5de" exitCode=0 Feb 17 18:04:05 crc kubenswrapper[4762]: I0217 18:04:05.582384 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"f6d7169d5319fd48ce328413c1944d85701526c1b8e50744c099c2e1b3abb5de"} Feb 17 18:04:05 crc kubenswrapper[4762]: I0217 18:04:05.582881 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"7383a3a662a9b124ecf96d7abf64c6e25de420f4076f78c28ca4eeb9a1cb55f6"} Feb 17 18:04:05 crc kubenswrapper[4762]: I0217 18:04:05.582905 4762 scope.go:117] "RemoveContainer" containerID="53eac13c8290dd1b353e345a7552ad443b04bbc8218394f015dea59e9defb212" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.008406 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc"] Feb 17 18:04:10 crc kubenswrapper[4762]: E0217 18:04:10.009309 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="util" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009326 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="util" Feb 17 18:04:10 crc kubenswrapper[4762]: E0217 18:04:10.009339 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="pull" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009350 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="pull" Feb 17 18:04:10 crc kubenswrapper[4762]: E0217 
18:04:10.009364 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="extract" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009372 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="extract" Feb 17 18:04:10 crc kubenswrapper[4762]: E0217 18:04:10.009382 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="extract" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009389 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="extract" Feb 17 18:04:10 crc kubenswrapper[4762]: E0217 18:04:10.009401 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="util" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009409 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="util" Feb 17 18:04:10 crc kubenswrapper[4762]: E0217 18:04:10.009426 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="pull" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009434 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="pull" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009587 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7999604c-7cbf-4bd9-9280-fb8d4d047737" containerName="extract" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.009603 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5" containerName="extract" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.010145 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.012473 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.012677 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rxclc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.022384 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc"] Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.109847 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfwx8\" (UniqueName: \"kubernetes.io/projected/9cdf848e-625b-4ac0-a1c2-60c34043a95c-kube-api-access-kfwx8\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.109909 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9cdf848e-625b-4ac0-a1c2-60c34043a95c-apiservice-cert\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.109982 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9cdf848e-625b-4ac0-a1c2-60c34043a95c-webhook-cert\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: 
\"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.211649 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfwx8\" (UniqueName: \"kubernetes.io/projected/9cdf848e-625b-4ac0-a1c2-60c34043a95c-kube-api-access-kfwx8\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.211707 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9cdf848e-625b-4ac0-a1c2-60c34043a95c-apiservice-cert\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.211759 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9cdf848e-625b-4ac0-a1c2-60c34043a95c-webhook-cert\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.221432 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9cdf848e-625b-4ac0-a1c2-60c34043a95c-webhook-cert\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.230113 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9cdf848e-625b-4ac0-a1c2-60c34043a95c-apiservice-cert\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.245313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfwx8\" (UniqueName: \"kubernetes.io/projected/9cdf848e-625b-4ac0-a1c2-60c34043a95c-kube-api-access-kfwx8\") pod \"horizon-operator-controller-manager-678dcfb94b-dlbqc\" (UID: \"9cdf848e-625b-4ac0-a1c2-60c34043a95c\") " pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.335111 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.868660 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc"] Feb 17 18:04:10 crc kubenswrapper[4762]: W0217 18:04:10.879284 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cdf848e_625b_4ac0_a1c2_60c34043a95c.slice/crio-dccc1a6c715c48080a18c8b67b6630fd5ddf0d5412b00981f75257f36205a184 WatchSource:0}: Error finding container dccc1a6c715c48080a18c8b67b6630fd5ddf0d5412b00981f75257f36205a184: Status 404 returned error can't find the container with id dccc1a6c715c48080a18c8b67b6630fd5ddf0d5412b00981f75257f36205a184 Feb 17 18:04:10 crc kubenswrapper[4762]: I0217 18:04:10.882563 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:04:11 crc kubenswrapper[4762]: I0217 18:04:11.625733 4762 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" event={"ID":"9cdf848e-625b-4ac0-a1c2-60c34043a95c","Type":"ContainerStarted","Data":"dccc1a6c715c48080a18c8b67b6630fd5ddf0d5412b00981f75257f36205a184"} Feb 17 18:04:13 crc kubenswrapper[4762]: I0217 18:04:13.639401 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" event={"ID":"9cdf848e-625b-4ac0-a1c2-60c34043a95c","Type":"ContainerStarted","Data":"6b119e096c5c31642de5eff6ca558f0a4b983b2929f9a99988ff4d759904cf8a"} Feb 17 18:04:13 crc kubenswrapper[4762]: I0217 18:04:13.640957 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:13 crc kubenswrapper[4762]: I0217 18:04:13.664036 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" podStartSLOduration=2.579761743 podStartE2EDuration="4.664014575s" podCreationTimestamp="2026-02-17 18:04:09 +0000 UTC" firstStartedPulling="2026-02-17 18:04:10.882345324 +0000 UTC m=+1002.527263334" lastFinishedPulling="2026-02-17 18:04:12.966598156 +0000 UTC m=+1004.611516166" observedRunningTime="2026-02-17 18:04:13.657127573 +0000 UTC m=+1005.302045593" watchObservedRunningTime="2026-02-17 18:04:13.664014575 +0000 UTC m=+1005.308932585" Feb 17 18:04:20 crc kubenswrapper[4762]: I0217 18:04:20.340014 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-678dcfb94b-dlbqc" Feb 17 18:04:21 crc kubenswrapper[4762]: I0217 18:04:21.880386 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb"] Feb 17 18:04:21 crc kubenswrapper[4762]: I0217 18:04:21.885382 4762 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb"] Feb 17 18:04:21 crc kubenswrapper[4762]: I0217 18:04:21.885587 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:21 crc kubenswrapper[4762]: I0217 18:04:21.897441 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Feb 17 18:04:21 crc kubenswrapper[4762]: I0217 18:04:21.897761 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mz7ll" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.000042 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d69b17f8-8fea-4129-b57c-5e67d1d0602a-apiservice-cert\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.000091 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4xjx\" (UniqueName: \"kubernetes.io/projected/d69b17f8-8fea-4129-b57c-5e67d1d0602a-kube-api-access-x4xjx\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.000172 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d69b17f8-8fea-4129-b57c-5e67d1d0602a-webhook-cert\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: 
\"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.101517 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d69b17f8-8fea-4129-b57c-5e67d1d0602a-webhook-cert\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.103010 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d69b17f8-8fea-4129-b57c-5e67d1d0602a-apiservice-cert\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.103386 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4xjx\" (UniqueName: \"kubernetes.io/projected/d69b17f8-8fea-4129-b57c-5e67d1d0602a-kube-api-access-x4xjx\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.108496 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d69b17f8-8fea-4129-b57c-5e67d1d0602a-apiservice-cert\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.113284 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d69b17f8-8fea-4129-b57c-5e67d1d0602a-webhook-cert\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.144374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4xjx\" (UniqueName: \"kubernetes.io/projected/d69b17f8-8fea-4129-b57c-5e67d1d0602a-kube-api-access-x4xjx\") pod \"swift-operator-controller-manager-5b455594df-pl8hb\" (UID: \"d69b17f8-8fea-4129-b57c-5e67d1d0602a\") " pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.215783 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.677597 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb"] Feb 17 18:04:22 crc kubenswrapper[4762]: I0217 18:04:22.698817 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" event={"ID":"d69b17f8-8fea-4129-b57c-5e67d1d0602a","Type":"ContainerStarted","Data":"34b1e427f21c1667b012ffe70feaf7a1625dc77ab4a3c94e4be898d3aeca57d7"} Feb 17 18:04:23 crc kubenswrapper[4762]: I0217 18:04:23.446108 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-5948fd7fc9-pkz2m" Feb 17 18:04:25 crc kubenswrapper[4762]: I0217 18:04:25.719232 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" 
event={"ID":"d69b17f8-8fea-4129-b57c-5e67d1d0602a","Type":"ContainerStarted","Data":"40c0ed4f4114fd0905d89761e76579d702dcb97b5e6d8b961f862fcd97b3c368"} Feb 17 18:04:25 crc kubenswrapper[4762]: I0217 18:04:25.719558 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:25 crc kubenswrapper[4762]: I0217 18:04:25.739233 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" podStartSLOduration=2.279894562 podStartE2EDuration="4.739214034s" podCreationTimestamp="2026-02-17 18:04:21 +0000 UTC" firstStartedPulling="2026-02-17 18:04:22.684125541 +0000 UTC m=+1014.329043551" lastFinishedPulling="2026-02-17 18:04:25.143445013 +0000 UTC m=+1016.788363023" observedRunningTime="2026-02-17 18:04:25.737196127 +0000 UTC m=+1017.382114147" watchObservedRunningTime="2026-02-17 18:04:25.739214034 +0000 UTC m=+1017.384132044" Feb 17 18:04:32 crc kubenswrapper[4762]: I0217 18:04:32.221244 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5b455594df-pl8hb" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.893992 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.899217 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.902064 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.902222 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.902547 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.902765 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-xxl98" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.947602 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.955845 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-cmgbl"] Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.956764 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.961660 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.962388 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Feb 17 18:04:35 crc kubenswrapper[4762]: I0217 18:04:35.962550 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.002998 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ae866fa5-748d-4935-a3d2-2fe08bc9693f-lock\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.003055 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.003098 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.003180 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj59w\" (UniqueName: 
\"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-kube-api-access-wj59w\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.003218 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ae866fa5-748d-4935-a3d2-2fe08bc9693f-cache\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.003980 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-cmgbl"] Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.022170 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-cmgbl"] Feb 17 18:04:36 crc kubenswrapper[4762]: E0217 18:04:36.022770 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dispersionconf etc-swift kube-api-access-k5f2w ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[dispersionconf etc-swift kube-api-access-k5f2w ring-data-devices scripts swiftconf]: context canceled" pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" podUID="716a7f31-b580-44b7-bf2b-b935a0895d87" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.105086 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-ring-data-devices\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.105142 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj59w\" (UniqueName: 
\"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-kube-api-access-wj59w\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.105803 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ae866fa5-748d-4935-a3d2-2fe08bc9693f-cache\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.105818 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ae866fa5-748d-4935-a3d2-2fe08bc9693f-cache\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106002 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-swiftconf\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106070 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-scripts\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106108 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ae866fa5-748d-4935-a3d2-2fe08bc9693f-lock\") pod \"swift-storage-0\" (UID: 
\"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106140 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106174 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/716a7f31-b580-44b7-bf2b-b935a0895d87-etc-swift\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106190 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-dispersionconf\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106216 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106238 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5f2w\" (UniqueName: \"kubernetes.io/projected/716a7f31-b580-44b7-bf2b-b935a0895d87-kube-api-access-k5f2w\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: E0217 18:04:36.106310 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:36 crc kubenswrapper[4762]: E0217 18:04:36.106331 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:04:36 crc kubenswrapper[4762]: E0217 18:04:36.106582 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:04:36.606563818 +0000 UTC m=+1028.251481828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.106583 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.107101 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ae866fa5-748d-4935-a3d2-2fe08bc9693f-lock\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.141953 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.142808 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj59w\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-kube-api-access-wj59w\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.207693 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-ring-data-devices\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.207799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-swiftconf\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.207832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-scripts\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.207887 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/716a7f31-b580-44b7-bf2b-b935a0895d87-etc-swift\") pod \"swift-ring-rebalance-cmgbl\" (UID: 
\"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.207908 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-dispersionconf\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.207937 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5f2w\" (UniqueName: \"kubernetes.io/projected/716a7f31-b580-44b7-bf2b-b935a0895d87-kube-api-access-k5f2w\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.208276 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/716a7f31-b580-44b7-bf2b-b935a0895d87-etc-swift\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.208754 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-ring-data-devices\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.208848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-scripts\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.212005 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-dispersionconf\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.230146 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-swiftconf\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.263335 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5f2w\" (UniqueName: \"kubernetes.io/projected/716a7f31-b580-44b7-bf2b-b935a0895d87-kube-api-access-k5f2w\") pod \"swift-ring-rebalance-cmgbl\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.613766 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:36 crc kubenswrapper[4762]: E0217 18:04:36.613995 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:36 crc kubenswrapper[4762]: E0217 18:04:36.614015 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:04:36 crc 
kubenswrapper[4762]: E0217 18:04:36.614096 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:04:37.614074656 +0000 UTC m=+1029.258992656 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.906545 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:36 crc kubenswrapper[4762]: I0217 18:04:36.915041 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.018710 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5f2w\" (UniqueName: \"kubernetes.io/projected/716a7f31-b580-44b7-bf2b-b935a0895d87-kube-api-access-k5f2w\") pod \"716a7f31-b580-44b7-bf2b-b935a0895d87\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.018768 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/716a7f31-b580-44b7-bf2b-b935a0895d87-etc-swift\") pod \"716a7f31-b580-44b7-bf2b-b935a0895d87\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.018795 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-dispersionconf\") pod \"716a7f31-b580-44b7-bf2b-b935a0895d87\" 
(UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.018827 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-ring-data-devices\") pod \"716a7f31-b580-44b7-bf2b-b935a0895d87\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.018878 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-scripts\") pod \"716a7f31-b580-44b7-bf2b-b935a0895d87\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.018917 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-swiftconf\") pod \"716a7f31-b580-44b7-bf2b-b935a0895d87\" (UID: \"716a7f31-b580-44b7-bf2b-b935a0895d87\") " Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.019636 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716a7f31-b580-44b7-bf2b-b935a0895d87-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "716a7f31-b580-44b7-bf2b-b935a0895d87" (UID: "716a7f31-b580-44b7-bf2b-b935a0895d87"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.019658 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "716a7f31-b580-44b7-bf2b-b935a0895d87" (UID: "716a7f31-b580-44b7-bf2b-b935a0895d87"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.019713 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-scripts" (OuterVolumeSpecName: "scripts") pod "716a7f31-b580-44b7-bf2b-b935a0895d87" (UID: "716a7f31-b580-44b7-bf2b-b935a0895d87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.022019 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "716a7f31-b580-44b7-bf2b-b935a0895d87" (UID: "716a7f31-b580-44b7-bf2b-b935a0895d87"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.025371 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716a7f31-b580-44b7-bf2b-b935a0895d87-kube-api-access-k5f2w" (OuterVolumeSpecName: "kube-api-access-k5f2w") pod "716a7f31-b580-44b7-bf2b-b935a0895d87" (UID: "716a7f31-b580-44b7-bf2b-b935a0895d87"). InnerVolumeSpecName "kube-api-access-k5f2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.025494 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "716a7f31-b580-44b7-bf2b-b935a0895d87" (UID: "716a7f31-b580-44b7-bf2b-b935a0895d87"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.121562 4762 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.121809 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5f2w\" (UniqueName: \"kubernetes.io/projected/716a7f31-b580-44b7-bf2b-b935a0895d87-kube-api-access-k5f2w\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.121871 4762 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/716a7f31-b580-44b7-bf2b-b935a0895d87-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.121957 4762 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/716a7f31-b580-44b7-bf2b-b935a0895d87-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.122013 4762 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.122078 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/716a7f31-b580-44b7-bf2b-b935a0895d87-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.496719 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-kfgpn"] Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.497759 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.502065 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-5v6m6" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.509485 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-kfgpn"] Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.535076 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6mq\" (UniqueName: \"kubernetes.io/projected/3060adf8-6d22-4370-a102-6d1cd6b0914b-kube-api-access-sf6mq\") pod \"glance-operator-index-kfgpn\" (UID: \"3060adf8-6d22-4370-a102-6d1cd6b0914b\") " pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.638192 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.638317 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6mq\" (UniqueName: \"kubernetes.io/projected/3060adf8-6d22-4370-a102-6d1cd6b0914b-kube-api-access-sf6mq\") pod \"glance-operator-index-kfgpn\" (UID: \"3060adf8-6d22-4370-a102-6d1cd6b0914b\") " pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:37 crc kubenswrapper[4762]: E0217 18:04:37.638548 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:37 crc kubenswrapper[4762]: E0217 18:04:37.638666 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod 
glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:04:37 crc kubenswrapper[4762]: E0217 18:04:37.638771 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:04:39.63875203 +0000 UTC m=+1031.283670040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.657214 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6mq\" (UniqueName: \"kubernetes.io/projected/3060adf8-6d22-4370-a102-6d1cd6b0914b-kube-api-access-sf6mq\") pod \"glance-operator-index-kfgpn\" (UID: \"3060adf8-6d22-4370-a102-6d1cd6b0914b\") " pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.912816 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-cmgbl" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.937862 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.942636 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-cmgbl"] Feb 17 18:04:37 crc kubenswrapper[4762]: I0217 18:04:37.948972 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-cmgbl"] Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.261471 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7"] Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.263146 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.293776 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.312721 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7"] Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.439388 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-kfgpn"] Feb 17 18:04:38 crc kubenswrapper[4762]: W0217 18:04:38.445223 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3060adf8_6d22_4370_a102_6d1cd6b0914b.slice/crio-c2012531909740ab684c3088e8e11b0abec99de64b1a92e831b9cb76663f0723 WatchSource:0}: Error finding container c2012531909740ab684c3088e8e11b0abec99de64b1a92e831b9cb76663f0723: Status 404 returned error can't find the container with id c2012531909740ab684c3088e8e11b0abec99de64b1a92e831b9cb76663f0723 Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.451400 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e576e3fe-21e1-4867-adcc-bb586e3a5921-log-httpd\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.451471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e576e3fe-21e1-4867-adcc-bb586e3a5921-config-data\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.451503 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.451524 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e576e3fe-21e1-4867-adcc-bb586e3a5921-run-httpd\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.451543 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9jkc\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-kube-api-access-t9jkc\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.553483 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e576e3fe-21e1-4867-adcc-bb586e3a5921-log-httpd\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.553857 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e576e3fe-21e1-4867-adcc-bb586e3a5921-config-data\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.553904 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.553929 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e576e3fe-21e1-4867-adcc-bb586e3a5921-run-httpd\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.553958 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jkc\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-kube-api-access-t9jkc\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.554077 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e576e3fe-21e1-4867-adcc-bb586e3a5921-log-httpd\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: E0217 18:04:38.554161 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:38 crc kubenswrapper[4762]: E0217 18:04:38.554195 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:04:38 crc kubenswrapper[4762]: E0217 18:04:38.554269 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:04:39.054240866 +0000 UTC m=+1030.699158966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.554314 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e576e3fe-21e1-4867-adcc-bb586e3a5921-run-httpd\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.575904 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e576e3fe-21e1-4867-adcc-bb586e3a5921-config-data\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.576372 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9jkc\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-kube-api-access-t9jkc\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:38 crc kubenswrapper[4762]: I0217 18:04:38.924445 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-kfgpn" event={"ID":"3060adf8-6d22-4370-a102-6d1cd6b0914b","Type":"ContainerStarted","Data":"c2012531909740ab684c3088e8e11b0abec99de64b1a92e831b9cb76663f0723"} Feb 17 18:04:39 crc kubenswrapper[4762]: I0217 18:04:39.046636 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716a7f31-b580-44b7-bf2b-b935a0895d87" 
path="/var/lib/kubelet/pods/716a7f31-b580-44b7-bf2b-b935a0895d87/volumes" Feb 17 18:04:39 crc kubenswrapper[4762]: I0217 18:04:39.060998 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:39 crc kubenswrapper[4762]: E0217 18:04:39.062166 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:39 crc kubenswrapper[4762]: E0217 18:04:39.062193 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:04:39 crc kubenswrapper[4762]: E0217 18:04:39.062279 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:04:40.062256867 +0000 UTC m=+1031.707174877 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:04:39 crc kubenswrapper[4762]: I0217 18:04:39.669562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:39 crc kubenswrapper[4762]: E0217 18:04:39.669895 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:39 crc kubenswrapper[4762]: E0217 18:04:39.669930 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:04:39 crc kubenswrapper[4762]: E0217 18:04:39.670180 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:04:43.670154616 +0000 UTC m=+1035.315072626 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:04:40 crc kubenswrapper[4762]: I0217 18:04:40.077378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:40 crc kubenswrapper[4762]: E0217 18:04:40.077533 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:40 crc kubenswrapper[4762]: E0217 18:04:40.077548 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:04:40 crc kubenswrapper[4762]: E0217 18:04:40.077595 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:04:42.077579229 +0000 UTC m=+1033.722497239 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:04:41 crc kubenswrapper[4762]: I0217 18:04:41.887733 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-kfgpn"] Feb 17 18:04:41 crc kubenswrapper[4762]: I0217 18:04:41.950957 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-kfgpn" event={"ID":"3060adf8-6d22-4370-a102-6d1cd6b0914b","Type":"ContainerStarted","Data":"e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec"} Feb 17 18:04:41 crc kubenswrapper[4762]: I0217 18:04:41.967720 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-kfgpn" podStartSLOduration=2.324152228 podStartE2EDuration="4.967705792s" podCreationTimestamp="2026-02-17 18:04:37 +0000 UTC" firstStartedPulling="2026-02-17 18:04:38.447956529 +0000 UTC m=+1030.092874539" lastFinishedPulling="2026-02-17 18:04:41.091510093 +0000 UTC m=+1032.736428103" observedRunningTime="2026-02-17 18:04:41.966437777 +0000 UTC m=+1033.611355787" watchObservedRunningTime="2026-02-17 18:04:41.967705792 +0000 UTC m=+1033.612623802" Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.105467 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:42 crc kubenswrapper[4762]: E0217 18:04:42.105698 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:42 crc 
kubenswrapper[4762]: E0217 18:04:42.105724 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:04:42 crc kubenswrapper[4762]: E0217 18:04:42.105793 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:04:46.105770397 +0000 UTC m=+1037.750688447 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.294880 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-jz5wd"] Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.295759 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.305053 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-jz5wd"] Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.409299 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wxz6\" (UniqueName: \"kubernetes.io/projected/6e043c44-ccec-451b-9ba3-505e49d89bce-kube-api-access-4wxz6\") pod \"glance-operator-index-jz5wd\" (UID: \"6e043c44-ccec-451b-9ba3-505e49d89bce\") " pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.510811 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wxz6\" (UniqueName: \"kubernetes.io/projected/6e043c44-ccec-451b-9ba3-505e49d89bce-kube-api-access-4wxz6\") pod \"glance-operator-index-jz5wd\" (UID: \"6e043c44-ccec-451b-9ba3-505e49d89bce\") " pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.540995 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wxz6\" (UniqueName: \"kubernetes.io/projected/6e043c44-ccec-451b-9ba3-505e49d89bce-kube-api-access-4wxz6\") pod \"glance-operator-index-jz5wd\" (UID: \"6e043c44-ccec-451b-9ba3-505e49d89bce\") " pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.612885 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:42 crc kubenswrapper[4762]: I0217 18:04:42.957525 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-kfgpn" podUID="3060adf8-6d22-4370-a102-6d1cd6b0914b" containerName="registry-server" containerID="cri-o://e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec" gracePeriod=2 Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.099144 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-jz5wd"] Feb 17 18:04:43 crc kubenswrapper[4762]: W0217 18:04:43.100430 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e043c44_ccec_451b_9ba3_505e49d89bce.slice/crio-6e5fe0fceaccc8edbed7e534f6baa562c5b25378cd882ba4a7455a130bb812a0 WatchSource:0}: Error finding container 6e5fe0fceaccc8edbed7e534f6baa562c5b25378cd882ba4a7455a130bb812a0: Status 404 returned error can't find the container with id 6e5fe0fceaccc8edbed7e534f6baa562c5b25378cd882ba4a7455a130bb812a0 Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.371851 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.527831 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf6mq\" (UniqueName: \"kubernetes.io/projected/3060adf8-6d22-4370-a102-6d1cd6b0914b-kube-api-access-sf6mq\") pod \"3060adf8-6d22-4370-a102-6d1cd6b0914b\" (UID: \"3060adf8-6d22-4370-a102-6d1cd6b0914b\") " Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.533290 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3060adf8-6d22-4370-a102-6d1cd6b0914b-kube-api-access-sf6mq" (OuterVolumeSpecName: "kube-api-access-sf6mq") pod "3060adf8-6d22-4370-a102-6d1cd6b0914b" (UID: "3060adf8-6d22-4370-a102-6d1cd6b0914b"). InnerVolumeSpecName "kube-api-access-sf6mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.630180 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf6mq\" (UniqueName: \"kubernetes.io/projected/3060adf8-6d22-4370-a102-6d1cd6b0914b-kube-api-access-sf6mq\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.731731 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:43 crc kubenswrapper[4762]: E0217 18:04:43.731946 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:43 crc kubenswrapper[4762]: E0217 18:04:43.731986 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:04:43 crc 
kubenswrapper[4762]: E0217 18:04:43.732062 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:04:51.732037715 +0000 UTC m=+1043.376955745 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.967608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-jz5wd" event={"ID":"6e043c44-ccec-451b-9ba3-505e49d89bce","Type":"ContainerStarted","Data":"78774a08c051dc9cab36b6ba75ce5cf143dd4cbc8e01c0beaeb25bb2ef3d1c21"} Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.967675 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-jz5wd" event={"ID":"6e043c44-ccec-451b-9ba3-505e49d89bce","Type":"ContainerStarted","Data":"6e5fe0fceaccc8edbed7e534f6baa562c5b25378cd882ba4a7455a130bb812a0"} Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.969896 4762 generic.go:334] "Generic (PLEG): container finished" podID="3060adf8-6d22-4370-a102-6d1cd6b0914b" containerID="e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec" exitCode=0 Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.969991 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-kfgpn" event={"ID":"3060adf8-6d22-4370-a102-6d1cd6b0914b","Type":"ContainerDied","Data":"e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec"} Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.970014 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-kfgpn" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.970041 4762 scope.go:117] "RemoveContainer" containerID="e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.970029 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-kfgpn" event={"ID":"3060adf8-6d22-4370-a102-6d1cd6b0914b","Type":"ContainerDied","Data":"c2012531909740ab684c3088e8e11b0abec99de64b1a92e831b9cb76663f0723"} Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.984434 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-jz5wd" podStartSLOduration=1.4209251489999999 podStartE2EDuration="1.984419887s" podCreationTimestamp="2026-02-17 18:04:42 +0000 UTC" firstStartedPulling="2026-02-17 18:04:43.105427238 +0000 UTC m=+1034.750345248" lastFinishedPulling="2026-02-17 18:04:43.668921976 +0000 UTC m=+1035.313839986" observedRunningTime="2026-02-17 18:04:43.982165843 +0000 UTC m=+1035.627083863" watchObservedRunningTime="2026-02-17 18:04:43.984419887 +0000 UTC m=+1035.629337897" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.994184 4762 scope.go:117] "RemoveContainer" containerID="e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec" Feb 17 18:04:43 crc kubenswrapper[4762]: E0217 18:04:43.994853 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec\": container with ID starting with e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec not found: ID does not exist" containerID="e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec" Feb 17 18:04:43 crc kubenswrapper[4762]: I0217 18:04:43.994902 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec"} err="failed to get container status \"e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec\": rpc error: code = NotFound desc = could not find container \"e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec\": container with ID starting with e190f6d18fd0f7a9d3863f41ec6a436c23904df52fff46e229f7fc1d26439fec not found: ID does not exist" Feb 17 18:04:44 crc kubenswrapper[4762]: I0217 18:04:44.000425 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-kfgpn"] Feb 17 18:04:44 crc kubenswrapper[4762]: I0217 18:04:44.005222 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-kfgpn"] Feb 17 18:04:45 crc kubenswrapper[4762]: I0217 18:04:45.043788 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3060adf8-6d22-4370-a102-6d1cd6b0914b" path="/var/lib/kubelet/pods/3060adf8-6d22-4370-a102-6d1cd6b0914b/volumes" Feb 17 18:04:46 crc kubenswrapper[4762]: I0217 18:04:46.179424 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:46 crc kubenswrapper[4762]: E0217 18:04:46.179835 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:46 crc kubenswrapper[4762]: E0217 18:04:46.179871 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:04:46 crc kubenswrapper[4762]: E0217 18:04:46.179963 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:04:54.179926472 +0000 UTC m=+1045.824844502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:04:51 crc kubenswrapper[4762]: I0217 18:04:51.762461 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:04:51 crc kubenswrapper[4762]: E0217 18:04:51.762673 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:51 crc kubenswrapper[4762]: E0217 18:04:51.763216 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:04:51 crc kubenswrapper[4762]: E0217 18:04:51.763292 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:05:07.763273028 +0000 UTC m=+1059.408191038 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:04:52 crc kubenswrapper[4762]: I0217 18:04:52.613373 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:52 crc kubenswrapper[4762]: I0217 18:04:52.613438 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:52 crc kubenswrapper[4762]: I0217 18:04:52.645668 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:53 crc kubenswrapper[4762]: I0217 18:04:53.052227 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-jz5wd" Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.196541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:04:54 crc kubenswrapper[4762]: E0217 18:04:54.196675 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:04:54 crc kubenswrapper[4762]: E0217 18:04:54.196945 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:04:54 crc kubenswrapper[4762]: E0217 18:04:54.196993 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:05:10.196977709 +0000 UTC m=+1061.841895719 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.931541 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8"] Feb 17 18:04:54 crc kubenswrapper[4762]: E0217 18:04:54.931823 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3060adf8-6d22-4370-a102-6d1cd6b0914b" containerName="registry-server" Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.931834 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3060adf8-6d22-4370-a102-6d1cd6b0914b" containerName="registry-server" Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.931956 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3060adf8-6d22-4370-a102-6d1cd6b0914b" containerName="registry-server" Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.932889 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.935172 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ph6qt" Feb 17 18:04:54 crc kubenswrapper[4762]: I0217 18:04:54.943035 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8"] Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.006868 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-bundle\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.006965 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gshk\" (UniqueName: \"kubernetes.io/projected/2729907a-9375-4c68-ab91-8470b5e7965f-kube-api-access-7gshk\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.007055 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-util\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 
18:04:55.108998 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-bundle\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.109068 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gshk\" (UniqueName: \"kubernetes.io/projected/2729907a-9375-4c68-ab91-8470b5e7965f-kube-api-access-7gshk\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.109215 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-util\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.109809 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-util\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.110061 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-bundle\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.138157 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gshk\" (UniqueName: \"kubernetes.io/projected/2729907a-9375-4c68-ab91-8470b5e7965f-kube-api-access-7gshk\") pod \"e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.249157 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:55 crc kubenswrapper[4762]: I0217 18:04:55.630722 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8"] Feb 17 18:04:56 crc kubenswrapper[4762]: I0217 18:04:56.049298 4762 generic.go:334] "Generic (PLEG): container finished" podID="2729907a-9375-4c68-ab91-8470b5e7965f" containerID="172416d7b4414ec851ae48a67453be7ef966c91b5cd0a50b12130384f7c2f47d" exitCode=0 Feb 17 18:04:56 crc kubenswrapper[4762]: I0217 18:04:56.049581 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" event={"ID":"2729907a-9375-4c68-ab91-8470b5e7965f","Type":"ContainerDied","Data":"172416d7b4414ec851ae48a67453be7ef966c91b5cd0a50b12130384f7c2f47d"} Feb 17 18:04:56 crc kubenswrapper[4762]: I0217 18:04:56.049617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" event={"ID":"2729907a-9375-4c68-ab91-8470b5e7965f","Type":"ContainerStarted","Data":"12cc8bdc3c356c5aa4404e98f98b3e3f9c7350b29a301fca20bb752207fecce5"} Feb 17 18:04:57 crc kubenswrapper[4762]: I0217 18:04:57.057092 4762 generic.go:334] "Generic (PLEG): container finished" podID="2729907a-9375-4c68-ab91-8470b5e7965f" containerID="6937d3f4dd17188a3616c33fe5df7a1465923828613d657adc635816728d0333" exitCode=0 Feb 17 18:04:57 crc kubenswrapper[4762]: I0217 18:04:57.057144 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" event={"ID":"2729907a-9375-4c68-ab91-8470b5e7965f","Type":"ContainerDied","Data":"6937d3f4dd17188a3616c33fe5df7a1465923828613d657adc635816728d0333"} Feb 17 18:04:58 crc kubenswrapper[4762]: I0217 18:04:58.065043 4762 generic.go:334] "Generic (PLEG): container finished" podID="2729907a-9375-4c68-ab91-8470b5e7965f" containerID="c2b51d040401334d9b5092c6fc3d09863583df973ac5c9c27c3208c2cd93fcfb" exitCode=0 Feb 17 18:04:58 crc kubenswrapper[4762]: I0217 18:04:58.065088 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" event={"ID":"2729907a-9375-4c68-ab91-8470b5e7965f","Type":"ContainerDied","Data":"c2b51d040401334d9b5092c6fc3d09863583df973ac5c9c27c3208c2cd93fcfb"} Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.358195 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.469312 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-util\") pod \"2729907a-9375-4c68-ab91-8470b5e7965f\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.469757 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gshk\" (UniqueName: \"kubernetes.io/projected/2729907a-9375-4c68-ab91-8470b5e7965f-kube-api-access-7gshk\") pod \"2729907a-9375-4c68-ab91-8470b5e7965f\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.469845 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-bundle\") pod \"2729907a-9375-4c68-ab91-8470b5e7965f\" (UID: \"2729907a-9375-4c68-ab91-8470b5e7965f\") " Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.471096 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-bundle" (OuterVolumeSpecName: "bundle") pod "2729907a-9375-4c68-ab91-8470b5e7965f" (UID: "2729907a-9375-4c68-ab91-8470b5e7965f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.475740 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2729907a-9375-4c68-ab91-8470b5e7965f-kube-api-access-7gshk" (OuterVolumeSpecName: "kube-api-access-7gshk") pod "2729907a-9375-4c68-ab91-8470b5e7965f" (UID: "2729907a-9375-4c68-ab91-8470b5e7965f"). InnerVolumeSpecName "kube-api-access-7gshk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.491854 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-util" (OuterVolumeSpecName: "util") pod "2729907a-9375-4c68-ab91-8470b5e7965f" (UID: "2729907a-9375-4c68-ab91-8470b5e7965f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.573732 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-util\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.573771 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gshk\" (UniqueName: \"kubernetes.io/projected/2729907a-9375-4c68-ab91-8470b5e7965f-kube-api-access-7gshk\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:59 crc kubenswrapper[4762]: I0217 18:04:59.573784 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2729907a-9375-4c68-ab91-8470b5e7965f-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:00 crc kubenswrapper[4762]: I0217 18:05:00.080752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" event={"ID":"2729907a-9375-4c68-ab91-8470b5e7965f","Type":"ContainerDied","Data":"12cc8bdc3c356c5aa4404e98f98b3e3f9c7350b29a301fca20bb752207fecce5"} Feb 17 18:05:00 crc kubenswrapper[4762]: I0217 18:05:00.080794 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12cc8bdc3c356c5aa4404e98f98b3e3f9c7350b29a301fca20bb752207fecce5" Feb 17 18:05:00 crc kubenswrapper[4762]: I0217 18:05:00.080897 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8" Feb 17 18:05:07 crc kubenswrapper[4762]: I0217 18:05:07.836033 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:05:07 crc kubenswrapper[4762]: E0217 18:05:07.836262 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:05:07 crc kubenswrapper[4762]: E0217 18:05:07.836726 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:05:07 crc kubenswrapper[4762]: E0217 18:05:07.836780 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:05:39.836761836 +0000 UTC m=+1091.481679836 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:05:10 crc kubenswrapper[4762]: I0217 18:05:10.295321 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:05:10 crc kubenswrapper[4762]: E0217 18:05:10.295489 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:05:10 crc kubenswrapper[4762]: E0217 18:05:10.295811 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:05:10 crc kubenswrapper[4762]: E0217 18:05:10.295875 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:05:42.295854252 +0000 UTC m=+1093.940772262 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.807206 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c"] Feb 17 18:05:13 crc kubenswrapper[4762]: E0217 18:05:13.808232 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="extract" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.808250 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="extract" Feb 17 18:05:13 crc kubenswrapper[4762]: E0217 18:05:13.808266 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="util" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.808273 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="util" Feb 17 18:05:13 crc kubenswrapper[4762]: E0217 18:05:13.808284 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="pull" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.808292 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="pull" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.808451 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2729907a-9375-4c68-ab91-8470b5e7965f" containerName="extract" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.809054 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.813076 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.813655 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-lgz5g" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.829978 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c"] Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.949395 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bdf1f157-1721-40cf-9c1b-288bb8190904-webhook-cert\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.949525 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bdf1f157-1721-40cf-9c1b-288bb8190904-apiservice-cert\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:13 crc kubenswrapper[4762]: I0217 18:05:13.949589 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqmhk\" (UniqueName: \"kubernetes.io/projected/bdf1f157-1721-40cf-9c1b-288bb8190904-kube-api-access-gqmhk\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: 
\"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.051235 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bdf1f157-1721-40cf-9c1b-288bb8190904-apiservice-cert\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.051319 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmhk\" (UniqueName: \"kubernetes.io/projected/bdf1f157-1721-40cf-9c1b-288bb8190904-kube-api-access-gqmhk\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.051349 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bdf1f157-1721-40cf-9c1b-288bb8190904-webhook-cert\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.064729 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bdf1f157-1721-40cf-9c1b-288bb8190904-apiservice-cert\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.066316 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bdf1f157-1721-40cf-9c1b-288bb8190904-webhook-cert\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.068780 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqmhk\" (UniqueName: \"kubernetes.io/projected/bdf1f157-1721-40cf-9c1b-288bb8190904-kube-api-access-gqmhk\") pod \"glance-operator-controller-manager-55b99585d6-r8h5c\" (UID: \"bdf1f157-1721-40cf-9c1b-288bb8190904\") " pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.135010 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:14 crc kubenswrapper[4762]: I0217 18:05:14.682610 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c"] Feb 17 18:05:15 crc kubenswrapper[4762]: I0217 18:05:15.194987 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" event={"ID":"bdf1f157-1721-40cf-9c1b-288bb8190904","Type":"ContainerStarted","Data":"c6565ce67bfb874f67946be24191e735e0dbd15af79f126c2d3dd8c7bf74a16d"} Feb 17 18:05:20 crc kubenswrapper[4762]: I0217 18:05:20.235794 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" event={"ID":"bdf1f157-1721-40cf-9c1b-288bb8190904","Type":"ContainerStarted","Data":"d05255580a2bdab203e689c1e08dd9c0379b9c28fd871bfc3fb3fece8a6bd523"} Feb 17 18:05:20 crc kubenswrapper[4762]: I0217 18:05:20.236411 4762 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:20 crc kubenswrapper[4762]: I0217 18:05:20.257894 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" podStartSLOduration=2.880438004 podStartE2EDuration="7.257876885s" podCreationTimestamp="2026-02-17 18:05:13 +0000 UTC" firstStartedPulling="2026-02-17 18:05:14.695908848 +0000 UTC m=+1066.340826858" lastFinishedPulling="2026-02-17 18:05:19.073347719 +0000 UTC m=+1070.718265739" observedRunningTime="2026-02-17 18:05:20.256636069 +0000 UTC m=+1071.901554079" watchObservedRunningTime="2026-02-17 18:05:20.257876885 +0000 UTC m=+1071.902794895" Feb 17 18:05:24 crc kubenswrapper[4762]: I0217 18:05:24.144617 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-55b99585d6-r8h5c" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.840949 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.842270 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.843807 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.844516 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.844930 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.848363 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-h7wsp" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.853257 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.884396 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-zd56k"] Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.888681 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.895774 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd"] Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.896924 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.898368 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.900769 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-zd56k"] Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.907640 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd"] Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.928966 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6bl9\" (UniqueName: \"kubernetes.io/projected/ed0a0e07-c833-44d2-bb21-7ff75db80be1-kube-api-access-p6bl9\") pod \"glance-db-create-zd56k\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929028 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sbp\" (UniqueName: \"kubernetes.io/projected/b9827e91-a646-4485-9117-e72e23035b7c-kube-api-access-j4sbp\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929099 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929224 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0a0e07-c833-44d2-bb21-7ff75db80be1-operator-scripts\") pod \"glance-db-create-zd56k\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929247 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjd7\" (UniqueName: \"kubernetes.io/projected/39c1aaaf-96e7-4356-9107-7adcd9cad2df-kube-api-access-lfjd7\") pod \"glance-eb6d-account-create-update-ltlzd\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929311 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c1aaaf-96e7-4356-9107-7adcd9cad2df-operator-scripts\") pod \"glance-eb6d-account-create-update-ltlzd\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929333 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:25 crc kubenswrapper[4762]: I0217 18:05:25.929388 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-scripts\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 
18:05:26.030340 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-scripts\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6bl9\" (UniqueName: \"kubernetes.io/projected/ed0a0e07-c833-44d2-bb21-7ff75db80be1-kube-api-access-p6bl9\") pod \"glance-db-create-zd56k\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030437 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sbp\" (UniqueName: \"kubernetes.io/projected/b9827e91-a646-4485-9117-e72e23035b7c-kube-api-access-j4sbp\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030479 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030556 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0a0e07-c833-44d2-bb21-7ff75db80be1-operator-scripts\") pod \"glance-db-create-zd56k\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030581 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lfjd7\" (UniqueName: \"kubernetes.io/projected/39c1aaaf-96e7-4356-9107-7adcd9cad2df-kube-api-access-lfjd7\") pod \"glance-eb6d-account-create-update-ltlzd\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030659 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c1aaaf-96e7-4356-9107-7adcd9cad2df-operator-scripts\") pod \"glance-eb6d-account-create-update-ltlzd\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.030683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.031580 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0a0e07-c833-44d2-bb21-7ff75db80be1-operator-scripts\") pod \"glance-db-create-zd56k\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.031681 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-scripts\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.031789 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.031812 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c1aaaf-96e7-4356-9107-7adcd9cad2df-operator-scripts\") pod \"glance-eb6d-account-create-update-ltlzd\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.046958 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.049082 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6bl9\" (UniqueName: \"kubernetes.io/projected/ed0a0e07-c833-44d2-bb21-7ff75db80be1-kube-api-access-p6bl9\") pod \"glance-db-create-zd56k\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.056046 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4sbp\" (UniqueName: \"kubernetes.io/projected/b9827e91-a646-4485-9117-e72e23035b7c-kube-api-access-j4sbp\") pod \"openstackclient\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.059608 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjd7\" 
(UniqueName: \"kubernetes.io/projected/39c1aaaf-96e7-4356-9107-7adcd9cad2df-kube-api-access-lfjd7\") pod \"glance-eb6d-account-create-update-ltlzd\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.158639 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.205042 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.211304 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.649340 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.658958 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-zd56k"] Feb 17 18:05:26 crc kubenswrapper[4762]: W0217 18:05:26.668728 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded0a0e07_c833_44d2_bb21_7ff75db80be1.slice/crio-e7898d5ecf43cdf042b5ee79d15350ba8ca9a816574cd0c6341252a0c80b64e9 WatchSource:0}: Error finding container e7898d5ecf43cdf042b5ee79d15350ba8ca9a816574cd0c6341252a0c80b64e9: Status 404 returned error can't find the container with id e7898d5ecf43cdf042b5ee79d15350ba8ca9a816574cd0c6341252a0c80b64e9 Feb 17 18:05:26 crc kubenswrapper[4762]: I0217 18:05:26.698779 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd"] Feb 17 18:05:26 crc kubenswrapper[4762]: W0217 18:05:26.702363 4762 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c1aaaf_96e7_4356_9107_7adcd9cad2df.slice/crio-038454515b4d8d10c9b2e912fb8c7aa36c1e4e7e419131905689a92b5d4121f4 WatchSource:0}: Error finding container 038454515b4d8d10c9b2e912fb8c7aa36c1e4e7e419131905689a92b5d4121f4: Status 404 returned error can't find the container with id 038454515b4d8d10c9b2e912fb8c7aa36c1e4e7e419131905689a92b5d4121f4 Feb 17 18:05:27 crc kubenswrapper[4762]: I0217 18:05:27.304243 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" event={"ID":"39c1aaaf-96e7-4356-9107-7adcd9cad2df","Type":"ContainerStarted","Data":"038454515b4d8d10c9b2e912fb8c7aa36c1e4e7e419131905689a92b5d4121f4"} Feb 17 18:05:27 crc kubenswrapper[4762]: I0217 18:05:27.306685 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"b9827e91-a646-4485-9117-e72e23035b7c","Type":"ContainerStarted","Data":"b63a5ed1d547fd7dd9af6386e8b7424ae0e90db1481aa17378647623910e06aa"} Feb 17 18:05:27 crc kubenswrapper[4762]: I0217 18:05:27.308049 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-zd56k" event={"ID":"ed0a0e07-c833-44d2-bb21-7ff75db80be1","Type":"ContainerStarted","Data":"021a4ca038c82f7b4a3651311de17a17a2a68d0d2baf7f4c92b48abcdac9185a"} Feb 17 18:05:27 crc kubenswrapper[4762]: I0217 18:05:27.308069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-zd56k" event={"ID":"ed0a0e07-c833-44d2-bb21-7ff75db80be1","Type":"ContainerStarted","Data":"e7898d5ecf43cdf042b5ee79d15350ba8ca9a816574cd0c6341252a0c80b64e9"} Feb 17 18:05:28 crc kubenswrapper[4762]: I0217 18:05:28.317028 4762 generic.go:334] "Generic (PLEG): container finished" podID="ed0a0e07-c833-44d2-bb21-7ff75db80be1" containerID="021a4ca038c82f7b4a3651311de17a17a2a68d0d2baf7f4c92b48abcdac9185a" exitCode=0 Feb 17 18:05:28 crc 
kubenswrapper[4762]: I0217 18:05:28.317127 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-zd56k" event={"ID":"ed0a0e07-c833-44d2-bb21-7ff75db80be1","Type":"ContainerDied","Data":"021a4ca038c82f7b4a3651311de17a17a2a68d0d2baf7f4c92b48abcdac9185a"} Feb 17 18:05:28 crc kubenswrapper[4762]: I0217 18:05:28.319082 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" event={"ID":"39c1aaaf-96e7-4356-9107-7adcd9cad2df","Type":"ContainerStarted","Data":"51ec8b176ceb5176438c0db7debb8f393fbcdcd2f3b380a83f2788441c519854"} Feb 17 18:05:28 crc kubenswrapper[4762]: I0217 18:05:28.358876 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" podStartSLOduration=3.358856535 podStartE2EDuration="3.358856535s" podCreationTimestamp="2026-02-17 18:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:05:28.354922993 +0000 UTC m=+1079.999841013" watchObservedRunningTime="2026-02-17 18:05:28.358856535 +0000 UTC m=+1080.003774545" Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.328399 4762 generic.go:334] "Generic (PLEG): container finished" podID="39c1aaaf-96e7-4356-9107-7adcd9cad2df" containerID="51ec8b176ceb5176438c0db7debb8f393fbcdcd2f3b380a83f2788441c519854" exitCode=0 Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.328488 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" event={"ID":"39c1aaaf-96e7-4356-9107-7adcd9cad2df","Type":"ContainerDied","Data":"51ec8b176ceb5176438c0db7debb8f393fbcdcd2f3b380a83f2788441c519854"} Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.655077 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.784458 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6bl9\" (UniqueName: \"kubernetes.io/projected/ed0a0e07-c833-44d2-bb21-7ff75db80be1-kube-api-access-p6bl9\") pod \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.784601 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0a0e07-c833-44d2-bb21-7ff75db80be1-operator-scripts\") pod \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\" (UID: \"ed0a0e07-c833-44d2-bb21-7ff75db80be1\") " Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.785699 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0a0e07-c833-44d2-bb21-7ff75db80be1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed0a0e07-c833-44d2-bb21-7ff75db80be1" (UID: "ed0a0e07-c833-44d2-bb21-7ff75db80be1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.792798 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0a0e07-c833-44d2-bb21-7ff75db80be1-kube-api-access-p6bl9" (OuterVolumeSpecName: "kube-api-access-p6bl9") pod "ed0a0e07-c833-44d2-bb21-7ff75db80be1" (UID: "ed0a0e07-c833-44d2-bb21-7ff75db80be1"). InnerVolumeSpecName "kube-api-access-p6bl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.886517 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6bl9\" (UniqueName: \"kubernetes.io/projected/ed0a0e07-c833-44d2-bb21-7ff75db80be1-kube-api-access-p6bl9\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:29 crc kubenswrapper[4762]: I0217 18:05:29.886559 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0a0e07-c833-44d2-bb21-7ff75db80be1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.335718 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-zd56k" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.335742 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-zd56k" event={"ID":"ed0a0e07-c833-44d2-bb21-7ff75db80be1","Type":"ContainerDied","Data":"e7898d5ecf43cdf042b5ee79d15350ba8ca9a816574cd0c6341252a0c80b64e9"} Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.335902 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7898d5ecf43cdf042b5ee79d15350ba8ca9a816574cd0c6341252a0c80b64e9" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.648024 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.699392 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfjd7\" (UniqueName: \"kubernetes.io/projected/39c1aaaf-96e7-4356-9107-7adcd9cad2df-kube-api-access-lfjd7\") pod \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.699586 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c1aaaf-96e7-4356-9107-7adcd9cad2df-operator-scripts\") pod \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\" (UID: \"39c1aaaf-96e7-4356-9107-7adcd9cad2df\") " Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.700397 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c1aaaf-96e7-4356-9107-7adcd9cad2df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39c1aaaf-96e7-4356-9107-7adcd9cad2df" (UID: "39c1aaaf-96e7-4356-9107-7adcd9cad2df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.703989 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c1aaaf-96e7-4356-9107-7adcd9cad2df-kube-api-access-lfjd7" (OuterVolumeSpecName: "kube-api-access-lfjd7") pod "39c1aaaf-96e7-4356-9107-7adcd9cad2df" (UID: "39c1aaaf-96e7-4356-9107-7adcd9cad2df"). InnerVolumeSpecName "kube-api-access-lfjd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.801886 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfjd7\" (UniqueName: \"kubernetes.io/projected/39c1aaaf-96e7-4356-9107-7adcd9cad2df-kube-api-access-lfjd7\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:30 crc kubenswrapper[4762]: I0217 18:05:30.801934 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c1aaaf-96e7-4356-9107-7adcd9cad2df-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:31 crc kubenswrapper[4762]: I0217 18:05:31.343817 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" event={"ID":"39c1aaaf-96e7-4356-9107-7adcd9cad2df","Type":"ContainerDied","Data":"038454515b4d8d10c9b2e912fb8c7aa36c1e4e7e419131905689a92b5d4121f4"} Feb 17 18:05:31 crc kubenswrapper[4762]: I0217 18:05:31.343863 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="038454515b4d8d10c9b2e912fb8c7aa36c1e4e7e419131905689a92b5d4121f4" Feb 17 18:05:31 crc kubenswrapper[4762]: I0217 18:05:31.343896 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.124348 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-tdkb5"] Feb 17 18:05:36 crc kubenswrapper[4762]: E0217 18:05:36.125540 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0a0e07-c833-44d2-bb21-7ff75db80be1" containerName="mariadb-database-create" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.125579 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0a0e07-c833-44d2-bb21-7ff75db80be1" containerName="mariadb-database-create" Feb 17 18:05:36 crc kubenswrapper[4762]: E0217 18:05:36.125593 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c1aaaf-96e7-4356-9107-7adcd9cad2df" containerName="mariadb-account-create-update" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.125602 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c1aaaf-96e7-4356-9107-7adcd9cad2df" containerName="mariadb-account-create-update" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.125843 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0a0e07-c833-44d2-bb21-7ff75db80be1" containerName="mariadb-database-create" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.125861 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c1aaaf-96e7-4356-9107-7adcd9cad2df" containerName="mariadb-account-create-update" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.126557 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.130190 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-2srwp" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.130497 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.131336 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tdkb5"] Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.282615 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-config-data\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.283013 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-db-sync-config-data\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.283088 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gt6r\" (UniqueName: \"kubernetes.io/projected/20eb84c0-e935-4918-9ced-8bff7e0a4245-kube-api-access-5gt6r\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.384122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-config-data\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.384210 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-db-sync-config-data\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.384264 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gt6r\" (UniqueName: \"kubernetes.io/projected/20eb84c0-e935-4918-9ced-8bff7e0a4245-kube-api-access-5gt6r\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.390346 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-db-sync-config-data\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.390696 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-config-data\") pod \"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.404113 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gt6r\" (UniqueName: \"kubernetes.io/projected/20eb84c0-e935-4918-9ced-8bff7e0a4245-kube-api-access-5gt6r\") pod 
\"glance-db-sync-tdkb5\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:36 crc kubenswrapper[4762]: I0217 18:05:36.453683 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:37 crc kubenswrapper[4762]: I0217 18:05:37.517851 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tdkb5"] Feb 17 18:05:37 crc kubenswrapper[4762]: W0217 18:05:37.521418 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20eb84c0_e935_4918_9ced_8bff7e0a4245.slice/crio-f4940f075db39722eae3a5d8380013ba5544b07f058f18419b3314fe12a446d1 WatchSource:0}: Error finding container f4940f075db39722eae3a5d8380013ba5544b07f058f18419b3314fe12a446d1: Status 404 returned error can't find the container with id f4940f075db39722eae3a5d8380013ba5544b07f058f18419b3314fe12a446d1 Feb 17 18:05:38 crc kubenswrapper[4762]: I0217 18:05:38.398474 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"b9827e91-a646-4485-9117-e72e23035b7c","Type":"ContainerStarted","Data":"17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85"} Feb 17 18:05:38 crc kubenswrapper[4762]: I0217 18:05:38.400335 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tdkb5" event={"ID":"20eb84c0-e935-4918-9ced-8bff7e0a4245","Type":"ContainerStarted","Data":"f4940f075db39722eae3a5d8380013ba5544b07f058f18419b3314fe12a446d1"} Feb 17 18:05:38 crc kubenswrapper[4762]: I0217 18:05:38.416608 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.600076622 podStartE2EDuration="13.416590755s" podCreationTimestamp="2026-02-17 18:05:25 +0000 UTC" firstStartedPulling="2026-02-17 18:05:26.670185833 +0000 UTC 
m=+1078.315103843" lastFinishedPulling="2026-02-17 18:05:37.486699966 +0000 UTC m=+1089.131617976" observedRunningTime="2026-02-17 18:05:38.413382223 +0000 UTC m=+1090.058300243" watchObservedRunningTime="2026-02-17 18:05:38.416590755 +0000 UTC m=+1090.061508765" Feb 17 18:05:39 crc kubenswrapper[4762]: I0217 18:05:39.935673 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:05:39 crc kubenswrapper[4762]: E0217 18:05:39.936155 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:05:39 crc kubenswrapper[4762]: E0217 18:05:39.936171 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:05:39 crc kubenswrapper[4762]: E0217 18:05:39.936245 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:06:43.936228939 +0000 UTC m=+1155.581146949 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:05:42 crc kubenswrapper[4762]: I0217 18:05:42.371521 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:05:42 crc kubenswrapper[4762]: E0217 18:05:42.371829 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:05:42 crc kubenswrapper[4762]: E0217 18:05:42.372123 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:05:42 crc kubenswrapper[4762]: E0217 18:05:42.372180 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:06:46.372161515 +0000 UTC m=+1158.017079525 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:05:48 crc kubenswrapper[4762]: I0217 18:05:48.471082 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tdkb5" event={"ID":"20eb84c0-e935-4918-9ced-8bff7e0a4245","Type":"ContainerStarted","Data":"9386635a8c4128ae781d33d8791c9d6ee8abef8b0c793ea20872f6424757af25"} Feb 17 18:05:48 crc kubenswrapper[4762]: I0217 18:05:48.492159 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-tdkb5" podStartSLOduration=2.081813915 podStartE2EDuration="12.492138701s" podCreationTimestamp="2026-02-17 18:05:36 +0000 UTC" firstStartedPulling="2026-02-17 18:05:37.525379828 +0000 UTC m=+1089.170297858" lastFinishedPulling="2026-02-17 18:05:47.935704634 +0000 UTC m=+1099.580622644" observedRunningTime="2026-02-17 18:05:48.488726794 +0000 UTC m=+1100.133644814" watchObservedRunningTime="2026-02-17 18:05:48.492138701 +0000 UTC m=+1100.137056721" Feb 17 18:05:58 crc kubenswrapper[4762]: I0217 18:05:58.544917 4762 generic.go:334] "Generic (PLEG): container finished" podID="20eb84c0-e935-4918-9ced-8bff7e0a4245" containerID="9386635a8c4128ae781d33d8791c9d6ee8abef8b0c793ea20872f6424757af25" exitCode=0 Feb 17 18:05:58 crc kubenswrapper[4762]: I0217 18:05:58.544972 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tdkb5" event={"ID":"20eb84c0-e935-4918-9ced-8bff7e0a4245","Type":"ContainerDied","Data":"9386635a8c4128ae781d33d8791c9d6ee8abef8b0c793ea20872f6424757af25"} Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.803202 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.950878 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-db-sync-config-data\") pod \"20eb84c0-e935-4918-9ced-8bff7e0a4245\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.950947 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-config-data\") pod \"20eb84c0-e935-4918-9ced-8bff7e0a4245\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.950964 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gt6r\" (UniqueName: \"kubernetes.io/projected/20eb84c0-e935-4918-9ced-8bff7e0a4245-kube-api-access-5gt6r\") pod \"20eb84c0-e935-4918-9ced-8bff7e0a4245\" (UID: \"20eb84c0-e935-4918-9ced-8bff7e0a4245\") " Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.956283 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20eb84c0-e935-4918-9ced-8bff7e0a4245-kube-api-access-5gt6r" (OuterVolumeSpecName: "kube-api-access-5gt6r") pod "20eb84c0-e935-4918-9ced-8bff7e0a4245" (UID: "20eb84c0-e935-4918-9ced-8bff7e0a4245"). InnerVolumeSpecName "kube-api-access-5gt6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.956455 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "20eb84c0-e935-4918-9ced-8bff7e0a4245" (UID: "20eb84c0-e935-4918-9ced-8bff7e0a4245"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:59 crc kubenswrapper[4762]: I0217 18:05:59.988798 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-config-data" (OuterVolumeSpecName: "config-data") pod "20eb84c0-e935-4918-9ced-8bff7e0a4245" (UID: "20eb84c0-e935-4918-9ced-8bff7e0a4245"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:00 crc kubenswrapper[4762]: I0217 18:06:00.052560 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:00 crc kubenswrapper[4762]: I0217 18:06:00.052596 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eb84c0-e935-4918-9ced-8bff7e0a4245-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:00 crc kubenswrapper[4762]: I0217 18:06:00.052608 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gt6r\" (UniqueName: \"kubernetes.io/projected/20eb84c0-e935-4918-9ced-8bff7e0a4245-kube-api-access-5gt6r\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:00 crc kubenswrapper[4762]: I0217 18:06:00.559080 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tdkb5" event={"ID":"20eb84c0-e935-4918-9ced-8bff7e0a4245","Type":"ContainerDied","Data":"f4940f075db39722eae3a5d8380013ba5544b07f058f18419b3314fe12a446d1"} Feb 17 18:06:00 crc kubenswrapper[4762]: I0217 18:06:00.559124 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4940f075db39722eae3a5d8380013ba5544b07f058f18419b3314fe12a446d1" Feb 17 18:06:00 crc kubenswrapper[4762]: I0217 18:06:00.559185 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tdkb5" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.102207 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:01 crc kubenswrapper[4762]: E0217 18:06:01.102784 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20eb84c0-e935-4918-9ced-8bff7e0a4245" containerName="glance-db-sync" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.102798 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="20eb84c0-e935-4918-9ced-8bff7e0a4245" containerName="glance-db-sync" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.102929 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="20eb84c0-e935-4918-9ced-8bff7e0a4245" containerName="glance-db-sync" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.103586 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.105460 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-2srwp" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.105514 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.105466 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.120519 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.121836 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.127496 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.151309 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268562 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-nvme\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268633 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-lib-modules\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268652 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-run\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268668 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b5cx\" (UniqueName: \"kubernetes.io/projected/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-kube-api-access-7b5cx\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " 
pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268685 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-logs\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268723 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqkb5\" (UniqueName: \"kubernetes.io/projected/3d65fe2f-0088-486f-ad7a-7f0b0a905986-kube-api-access-hqkb5\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268751 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-scripts\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268788 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268816 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268843 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-sys\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268859 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-sys\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268888 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-nvme\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268950 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268973 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-lib-modules\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" 
Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.268994 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269014 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269070 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-scripts\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269154 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-httpd-run\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269251 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 
18:06:01.269285 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269326 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-logs\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269346 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-dev\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-run\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269390 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-httpd-run\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269437 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-dev\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269462 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-config-data\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269534 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.269561 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-config-data\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371431 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-nvme\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371494 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-lib-modules\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371518 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-run\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371538 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b5cx\" (UniqueName: \"kubernetes.io/projected/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-kube-api-access-7b5cx\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-logs\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371581 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-lib-modules\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371608 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-run\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371602 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqkb5\" (UniqueName: \"kubernetes.io/projected/3d65fe2f-0088-486f-ad7a-7f0b0a905986-kube-api-access-hqkb5\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371713 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-scripts\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371780 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371818 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-sys\") pod \"glance-default-single-1\" (UID: 
\"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-nvme\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371874 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-sys\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371922 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-lib-modules\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371950 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 
18:06:01.371970 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371994 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-nvme\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-scripts\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372031 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-httpd-run\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-logs\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.371650 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-nvme\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372119 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372158 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372592 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-sys\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372661 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-lib-modules\") pod 
\"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372731 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-sys\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372902 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373104 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-dev\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373208 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-logs\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373395 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-httpd-run\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-run\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373584 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-dev\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373701 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-config-data\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373822 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.373921 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-config-data\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.378071 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-scripts\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.378429 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-httpd-run\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.378495 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.378835 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-httpd-run\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.372844 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.381481 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-scripts\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.381559 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-dev\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.381592 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-run\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.381643 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-dev\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.381828 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.382130 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-logs\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.385089 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-config-data\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.385209 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.385723 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-config-data\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.391360 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b5cx\" (UniqueName: \"kubernetes.io/projected/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-kube-api-access-7b5cx\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.392296 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqkb5\" (UniqueName: \"kubernetes.io/projected/3d65fe2f-0088-486f-ad7a-7f0b0a905986-kube-api-access-hqkb5\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.394910 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.398519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.406667 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.406873 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.420505 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.477300 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.839713 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:01 crc kubenswrapper[4762]: I0217 18:06:01.958301 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:01 crc kubenswrapper[4762]: W0217 18:06:01.966574 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d65fe2f_0088_486f_ad7a_7f0b0a905986.slice/crio-6de2164e105c9a28433c0755962c357fa39ed53bd4aa8574bd57cea2315efc35 WatchSource:0}: Error finding container 6de2164e105c9a28433c0755962c357fa39ed53bd4aa8574bd57cea2315efc35: Status 404 returned error can't find the container with id 6de2164e105c9a28433c0755962c357fa39ed53bd4aa8574bd57cea2315efc35 Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.051556 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.590427 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d65fe2f-0088-486f-ad7a-7f0b0a905986","Type":"ContainerStarted","Data":"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20"} Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.590706 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d65fe2f-0088-486f-ad7a-7f0b0a905986","Type":"ContainerStarted","Data":"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba"} Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.590683 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-httpd" containerID="cri-o://85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20" gracePeriod=30 Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.590718 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d65fe2f-0088-486f-ad7a-7f0b0a905986","Type":"ContainerStarted","Data":"6de2164e105c9a28433c0755962c357fa39ed53bd4aa8574bd57cea2315efc35"} Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.590538 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-log" containerID="cri-o://7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba" gracePeriod=30 Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.612679 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb","Type":"ContainerStarted","Data":"ea08be733184e283bcc0db5bfefcbf5c92d65a84bbecef38a52390235f21529b"} Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.612738 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb","Type":"ContainerStarted","Data":"b109a0337e4a255a8130282ec66819231dc1311f1ebd9e1d08a346f41f13c043"} Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.612755 
4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb","Type":"ContainerStarted","Data":"15f4854561ace4195c25f099a54ea649f6a76488547143ce7a75cb2f96dbf2f8"} Feb 17 18:06:02 crc kubenswrapper[4762]: I0217 18:06:02.629943 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=1.629926004 podStartE2EDuration="1.629926004s" podCreationTimestamp="2026-02-17 18:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:02.625106578 +0000 UTC m=+1114.270024588" watchObservedRunningTime="2026-02-17 18:06:02.629926004 +0000 UTC m=+1114.274844014" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.013452 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.040767 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.040752097 podStartE2EDuration="2.040752097s" podCreationTimestamp="2026-02-17 18:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:02.671887666 +0000 UTC m=+1114.316805686" watchObservedRunningTime="2026-02-17 18:06:03.040752097 +0000 UTC m=+1114.685670107" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101197 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-scripts\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101293 
4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqkb5\" (UniqueName: \"kubernetes.io/projected/3d65fe2f-0088-486f-ad7a-7f0b0a905986-kube-api-access-hqkb5\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101322 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-var-locks-brick\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101350 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101372 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-run\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101387 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-sys\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101423 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-httpd-run\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 
18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101417 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101438 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-nvme\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101466 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101497 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-sys" (OuterVolumeSpecName: "sys") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101512 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101574 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-iscsi\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101573 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-run" (OuterVolumeSpecName: "run") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101604 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-logs\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101723 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-config-data\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101700 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101794 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-dev\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101818 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-logs" (OuterVolumeSpecName: "logs") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101827 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-lib-modules\") pod \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\" (UID: \"3d65fe2f-0088-486f-ad7a-7f0b0a905986\") " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101852 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.101884 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-dev" (OuterVolumeSpecName: "dev") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.102231 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103317 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103347 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103360 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103372 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103384 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103395 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65fe2f-0088-486f-ad7a-7f0b0a905986-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103406 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103417 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.103428 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d65fe2f-0088-486f-ad7a-7f0b0a905986-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.106497 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d65fe2f-0088-486f-ad7a-7f0b0a905986-kube-api-access-hqkb5" (OuterVolumeSpecName: "kube-api-access-hqkb5") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "kube-api-access-hqkb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.106780 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-scripts" (OuterVolumeSpecName: "scripts") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.107078 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.107254 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). 
InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.136740 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-config-data" (OuterVolumeSpecName: "config-data") pod "3d65fe2f-0088-486f-ad7a-7f0b0a905986" (UID: "3d65fe2f-0088-486f-ad7a-7f0b0a905986"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.204790 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.205057 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqkb5\" (UniqueName: \"kubernetes.io/projected/3d65fe2f-0088-486f-ad7a-7f0b0a905986-kube-api-access-hqkb5\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.205090 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.205104 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.205114 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65fe2f-0088-486f-ad7a-7f0b0a905986-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.216971 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.217586 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.306711 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.306743 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.624937 4762 generic.go:334] "Generic (PLEG): container finished" podID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerID="85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20" exitCode=143 Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.624973 4762 generic.go:334] "Generic (PLEG): container finished" podID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerID="7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba" exitCode=143 Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.624994 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d65fe2f-0088-486f-ad7a-7f0b0a905986","Type":"ContainerDied","Data":"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20"} Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.625056 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d65fe2f-0088-486f-ad7a-7f0b0a905986","Type":"ContainerDied","Data":"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba"} Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.625072 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d65fe2f-0088-486f-ad7a-7f0b0a905986","Type":"ContainerDied","Data":"6de2164e105c9a28433c0755962c357fa39ed53bd4aa8574bd57cea2315efc35"} Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.625092 4762 scope.go:117] "RemoveContainer" containerID="85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.626279 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.651492 4762 scope.go:117] "RemoveContainer" containerID="7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.662619 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.673478 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.679266 4762 scope.go:117] "RemoveContainer" containerID="85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.684913 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:03 crc kubenswrapper[4762]: E0217 18:06:03.685558 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-httpd" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.685584 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-httpd" Feb 17 18:06:03 crc kubenswrapper[4762]: E0217 18:06:03.685607 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-log" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.685651 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-log" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.686054 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-httpd" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.686094 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" containerName="glance-log" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.687477 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: E0217 18:06:03.690921 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20\": container with ID starting with 85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20 not found: ID does not exist" containerID="85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.690955 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20"} err="failed to get container status \"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20\": rpc error: code = NotFound desc = could not find container \"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20\": container with ID starting with 85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20 not found: ID does not exist" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.690985 4762 scope.go:117] 
"RemoveContainer" containerID="7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba" Feb 17 18:06:03 crc kubenswrapper[4762]: E0217 18:06:03.691788 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba\": container with ID starting with 7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba not found: ID does not exist" containerID="7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.691829 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba"} err="failed to get container status \"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba\": rpc error: code = NotFound desc = could not find container \"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba\": container with ID starting with 7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba not found: ID does not exist" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.691856 4762 scope.go:117] "RemoveContainer" containerID="85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.692305 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20"} err="failed to get container status \"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20\": rpc error: code = NotFound desc = could not find container \"85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20\": container with ID starting with 85d4ccfbc7e6eabe588a567a6f8e84674ba433d9e633dcb1569dab7d48ac2a20 not found: ID does not exist" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.692351 4762 
scope.go:117] "RemoveContainer" containerID="7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.692580 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba"} err="failed to get container status \"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba\": rpc error: code = NotFound desc = could not find container \"7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba\": container with ID starting with 7e3c5cd54aed3fddb664be1e01c86f1c48a076f8ce53fa0927a7a500f8d48cba not found: ID does not exist" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.704378 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819409 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819463 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-config-data\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819499 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-lib-modules\") pod \"glance-default-single-1\" (UID: 
\"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819523 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-logs\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819573 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-scripts\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819654 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819698 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819724 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxfl\" (UniqueName: \"kubernetes.io/projected/47b87e3b-550c-4f8b-ba57-cab19b9b5111-kube-api-access-djxfl\") pod \"glance-default-single-1\" 
(UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819871 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-run\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.819968 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.820032 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-dev\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.820063 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-httpd-run\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.820102 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-nvme\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.820183 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-sys\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.921928 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.921979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxfl\" (UniqueName: \"kubernetes.io/projected/47b87e3b-550c-4f8b-ba57-cab19b9b5111-kube-api-access-djxfl\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922020 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-run\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922068 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 
18:06:03.922089 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922133 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-dev\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-dev\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922221 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-httpd-run\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922294 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-nvme\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922363 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-sys\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922433 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-sys\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922444 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-nvme\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922499 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-run\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922441 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: 
\"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922581 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-config-data\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922593 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922667 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-lib-modules\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922696 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-logs\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-scripts\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 
17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-lib-modules\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.922846 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.923021 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-httpd-run\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.923133 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-logs\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.929015 4762 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-config-data\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.932405 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-scripts\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.942203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxfl\" (UniqueName: \"kubernetes.io/projected/47b87e3b-550c-4f8b-ba57-cab19b9b5111-kube-api-access-djxfl\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.942607 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:03 crc kubenswrapper[4762]: I0217 18:06:03.953693 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:04 crc kubenswrapper[4762]: I0217 18:06:04.054308 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:04 crc kubenswrapper[4762]: I0217 18:06:04.493489 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:04 crc kubenswrapper[4762]: W0217 18:06:04.506154 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b87e3b_550c_4f8b_ba57_cab19b9b5111.slice/crio-606c3e8dd073dc0ed6636a1f65d7c09dedb0bb705624c6332e1870554029abb3 WatchSource:0}: Error finding container 606c3e8dd073dc0ed6636a1f65d7c09dedb0bb705624c6332e1870554029abb3: Status 404 returned error can't find the container with id 606c3e8dd073dc0ed6636a1f65d7c09dedb0bb705624c6332e1870554029abb3 Feb 17 18:06:04 crc kubenswrapper[4762]: I0217 18:06:04.558366 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:06:04 crc kubenswrapper[4762]: I0217 18:06:04.558425 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:06:04 crc kubenswrapper[4762]: I0217 18:06:04.635070 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"47b87e3b-550c-4f8b-ba57-cab19b9b5111","Type":"ContainerStarted","Data":"606c3e8dd073dc0ed6636a1f65d7c09dedb0bb705624c6332e1870554029abb3"} Feb 17 18:06:05 crc kubenswrapper[4762]: I0217 18:06:05.044306 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3d65fe2f-0088-486f-ad7a-7f0b0a905986" path="/var/lib/kubelet/pods/3d65fe2f-0088-486f-ad7a-7f0b0a905986/volumes" Feb 17 18:06:05 crc kubenswrapper[4762]: I0217 18:06:05.645377 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"47b87e3b-550c-4f8b-ba57-cab19b9b5111","Type":"ContainerStarted","Data":"6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9"} Feb 17 18:06:05 crc kubenswrapper[4762]: I0217 18:06:05.645435 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"47b87e3b-550c-4f8b-ba57-cab19b9b5111","Type":"ContainerStarted","Data":"4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b"} Feb 17 18:06:05 crc kubenswrapper[4762]: I0217 18:06:05.680184 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.680162214 podStartE2EDuration="2.680162214s" podCreationTimestamp="2026-02-17 18:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:05.674769181 +0000 UTC m=+1117.319687211" watchObservedRunningTime="2026-02-17 18:06:05.680162214 +0000 UTC m=+1117.325080224" Feb 17 18:06:11 crc kubenswrapper[4762]: I0217 18:06:11.421289 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:11 crc kubenswrapper[4762]: I0217 18:06:11.421597 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:11 crc kubenswrapper[4762]: I0217 18:06:11.446697 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:11 crc kubenswrapper[4762]: I0217 18:06:11.459036 4762 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:11 crc kubenswrapper[4762]: I0217 18:06:11.872342 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:11 crc kubenswrapper[4762]: I0217 18:06:11.872814 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:13 crc kubenswrapper[4762]: I0217 18:06:13.884990 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:06:13 crc kubenswrapper[4762]: I0217 18:06:13.885330 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:06:13 crc kubenswrapper[4762]: I0217 18:06:13.917915 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:13 crc kubenswrapper[4762]: I0217 18:06:13.918734 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:14 crc kubenswrapper[4762]: I0217 18:06:14.055403 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:14 crc kubenswrapper[4762]: I0217 18:06:14.055452 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:14 crc kubenswrapper[4762]: I0217 18:06:14.081386 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:14 crc kubenswrapper[4762]: I0217 18:06:14.095329 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:14 crc kubenswrapper[4762]: I0217 18:06:14.891661 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:14 crc kubenswrapper[4762]: I0217 18:06:14.892054 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:16 crc kubenswrapper[4762]: I0217 18:06:16.892179 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:16 crc kubenswrapper[4762]: I0217 18:06:16.903162 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:06:16 crc kubenswrapper[4762]: I0217 18:06:16.966507 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:17 crc kubenswrapper[4762]: I0217 18:06:17.056119 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:17 crc kubenswrapper[4762]: I0217 18:06:17.056433 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-log" containerID="cri-o://b109a0337e4a255a8130282ec66819231dc1311f1ebd9e1d08a346f41f13c043" gracePeriod=30 Feb 17 18:06:17 crc kubenswrapper[4762]: I0217 18:06:17.056969 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-httpd" containerID="cri-o://ea08be733184e283bcc0db5bfefcbf5c92d65a84bbecef38a52390235f21529b" gracePeriod=30 Feb 17 18:06:17 crc kubenswrapper[4762]: I0217 18:06:17.911532 4762 generic.go:334] "Generic (PLEG): container finished" podID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerID="b109a0337e4a255a8130282ec66819231dc1311f1ebd9e1d08a346f41f13c043" exitCode=143 Feb 17 18:06:17 crc kubenswrapper[4762]: I0217 18:06:17.911674 4762 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb","Type":"ContainerDied","Data":"b109a0337e4a255a8130282ec66819231dc1311f1ebd9e1d08a346f41f13c043"} Feb 17 18:06:20 crc kubenswrapper[4762]: I0217 18:06:20.932237 4762 generic.go:334] "Generic (PLEG): container finished" podID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerID="ea08be733184e283bcc0db5bfefcbf5c92d65a84bbecef38a52390235f21529b" exitCode=0 Feb 17 18:06:20 crc kubenswrapper[4762]: I0217 18:06:20.932440 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb","Type":"ContainerDied","Data":"ea08be733184e283bcc0db5bfefcbf5c92d65a84bbecef38a52390235f21529b"} Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.094070 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220494 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-nvme\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220554 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-httpd-run\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220583 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-var-locks-brick\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: 
\"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220612 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220647 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-sys\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220683 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220698 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-lib-modules\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220750 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-scripts\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220769 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-dev\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: 
\"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220797 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-run\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220819 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-logs\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220845 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-config-data\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220876 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b5cx\" (UniqueName: \"kubernetes.io/projected/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-kube-api-access-7b5cx\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.220925 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-iscsi\") pod \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\" (UID: \"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb\") " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.221275 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-iscsi" 
(OuterVolumeSpecName: "etc-iscsi") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.221305 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.221527 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.221552 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.222412 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-dev" (OuterVolumeSpecName: "dev") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.222492 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-sys" (OuterVolumeSpecName: "sys") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.222500 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.222555 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-run" (OuterVolumeSpecName: "run") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.222778 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-logs" (OuterVolumeSpecName: "logs") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.227528 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-kube-api-access-7b5cx" (OuterVolumeSpecName: "kube-api-access-7b5cx") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "kube-api-access-7b5cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.227632 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.228008 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.228373 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-scripts" (OuterVolumeSpecName: "scripts") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.259444 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-config-data" (OuterVolumeSpecName: "config-data") pod "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" (UID: "0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322592 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322639 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322650 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322660 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322670 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322681 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b5cx\" (UniqueName: \"kubernetes.io/projected/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-kube-api-access-7b5cx\") on node \"crc\" 
DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322693 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322703 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322711 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322722 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322754 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322765 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322775 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.322790 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.334142 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.339150 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.424005 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.424052 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.940402 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb","Type":"ContainerDied","Data":"15f4854561ace4195c25f099a54ea649f6a76488547143ce7a75cb2f96dbf2f8"} Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.940480 4762 scope.go:117] "RemoveContainer" containerID="ea08be733184e283bcc0db5bfefcbf5c92d65a84bbecef38a52390235f21529b" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.940733 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.961742 4762 scope.go:117] "RemoveContainer" containerID="b109a0337e4a255a8130282ec66819231dc1311f1ebd9e1d08a346f41f13c043" Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.977469 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:21 crc kubenswrapper[4762]: I0217 18:06:21.984007 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.001896 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:22 crc kubenswrapper[4762]: E0217 18:06:22.002144 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-log" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.002156 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-log" Feb 17 18:06:22 crc kubenswrapper[4762]: E0217 18:06:22.002172 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-httpd" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.002178 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-httpd" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.002314 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-httpd" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.002328 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" containerName="glance-log" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.002992 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.018359 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134155 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-dev\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134203 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-logs\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134234 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-config-data\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134266 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-run\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134413 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134528 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134576 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjk9b\" (UniqueName: \"kubernetes.io/projected/cfeff88c-b91a-496f-a4da-f82c9eb10472-kube-api-access-mjk9b\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134702 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134732 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134757 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-sys\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134826 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-httpd-run\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134866 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-scripts\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.134903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-lib-modules\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236064 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-lib-modules\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-dev\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236226 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-lib-modules\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-logs\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236644 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-config-data\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236693 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-run\") pod \"glance-default-single-0\" (UID: 
\"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236725 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236786 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-run\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236796 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236851 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjk9b\" (UniqueName: \"kubernetes.io/projected/cfeff88c-b91a-496f-a4da-f82c9eb10472-kube-api-access-mjk9b\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 
18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236955 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.236976 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237015 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237040 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237126 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-sys\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237166 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-httpd-run\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237223 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-scripts\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237228 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-logs\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-sys\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237380 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") 
device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237391 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237569 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-dev\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.237673 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-httpd-run\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.242340 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-scripts\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.242415 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-config-data\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: 
I0217 18:06:22.255332 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjk9b\" (UniqueName: \"kubernetes.io/projected/cfeff88c-b91a-496f-a4da-f82c9eb10472-kube-api-access-mjk9b\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.261027 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.269116 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.316435 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.734468 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:22 crc kubenswrapper[4762]: W0217 18:06:22.745144 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfeff88c_b91a_496f_a4da_f82c9eb10472.slice/crio-67d8e2e96f418b175a0cb62fc3605285d30ef1862f982e8a02a13e6c86cf2625 WatchSource:0}: Error finding container 67d8e2e96f418b175a0cb62fc3605285d30ef1862f982e8a02a13e6c86cf2625: Status 404 returned error can't find the container with id 67d8e2e96f418b175a0cb62fc3605285d30ef1862f982e8a02a13e6c86cf2625 Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.950603 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cfeff88c-b91a-496f-a4da-f82c9eb10472","Type":"ContainerStarted","Data":"25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea"} Feb 17 18:06:22 crc kubenswrapper[4762]: I0217 18:06:22.951785 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cfeff88c-b91a-496f-a4da-f82c9eb10472","Type":"ContainerStarted","Data":"67d8e2e96f418b175a0cb62fc3605285d30ef1862f982e8a02a13e6c86cf2625"} Feb 17 18:06:23 crc kubenswrapper[4762]: I0217 18:06:23.045972 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb" path="/var/lib/kubelet/pods/0d3f5e29-76e0-48f8-b33d-b08fa79ad7bb/volumes" Feb 17 18:06:23 crc kubenswrapper[4762]: I0217 18:06:23.960670 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cfeff88c-b91a-496f-a4da-f82c9eb10472","Type":"ContainerStarted","Data":"421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde"} Feb 17 
18:06:23 crc kubenswrapper[4762]: I0217 18:06:23.985420 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.985396793 podStartE2EDuration="2.985396793s" podCreationTimestamp="2026-02-17 18:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:23.981454841 +0000 UTC m=+1135.626372851" watchObservedRunningTime="2026-02-17 18:06:23.985396793 +0000 UTC m=+1135.630314803" Feb 17 18:06:32 crc kubenswrapper[4762]: I0217 18:06:32.317134 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:32 crc kubenswrapper[4762]: I0217 18:06:32.318947 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:32 crc kubenswrapper[4762]: I0217 18:06:32.359323 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:32 crc kubenswrapper[4762]: I0217 18:06:32.364062 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:33 crc kubenswrapper[4762]: I0217 18:06:33.020250 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:33 crc kubenswrapper[4762]: I0217 18:06:33.020934 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:34 crc kubenswrapper[4762]: I0217 18:06:34.558003 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 17 18:06:34 crc kubenswrapper[4762]: I0217 18:06:34.558313 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:06:35 crc kubenswrapper[4762]: I0217 18:06:35.031369 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:06:35 crc kubenswrapper[4762]: I0217 18:06:35.031394 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:06:35 crc kubenswrapper[4762]: I0217 18:06:35.136100 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:35 crc kubenswrapper[4762]: I0217 18:06:35.216661 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:38 crc kubenswrapper[4762]: E0217 18:06:38.915798 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-storage-0" podUID="ae866fa5-748d-4935-a3d2-2fe08bc9693f" Feb 17 18:06:39 crc kubenswrapper[4762]: I0217 18:06:39.062007 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:06:41 crc kubenswrapper[4762]: E0217 18:06:41.281649 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" podUID="e576e3fe-21e1-4867-adcc-bb586e3a5921" Feb 17 18:06:42 crc kubenswrapper[4762]: I0217 18:06:42.079989 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:06:43 crc kubenswrapper[4762]: I0217 18:06:43.976511 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:06:43 crc kubenswrapper[4762]: E0217 18:06:43.976859 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:06:43 crc kubenswrapper[4762]: E0217 18:06:43.977246 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:06:43 crc kubenswrapper[4762]: E0217 18:06:43.977356 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:08:45.977324265 +0000 UTC m=+1277.622242315 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:06:46 crc kubenswrapper[4762]: I0217 18:06:46.410497 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:06:46 crc kubenswrapper[4762]: E0217 18:06:46.410748 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:06:46 crc kubenswrapper[4762]: E0217 18:06:46.411454 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:06:46 crc kubenswrapper[4762]: E0217 18:06:46.411523 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:08:48.411501446 +0000 UTC m=+1280.056419456 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.196730 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tdkb5"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.210117 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tdkb5"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.267536 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.268109 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-log" containerID="cri-o://4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b" gracePeriod=30 Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.268684 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-httpd" containerID="cri-o://6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9" gracePeriod=30 Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.278402 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.278765 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-log" 
containerID="cri-o://25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea" gracePeriod=30 Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.278930 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-httpd" containerID="cri-o://421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde" gracePeriod=30 Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.317741 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glanceeb6d-account-delete-87d6c"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.318561 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.338248 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanceeb6d-account-delete-87d6c"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.401973 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78347175-f74d-43d6-8cd2-b17fabbd5f27-operator-scripts\") pod \"glanceeb6d-account-delete-87d6c\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.402018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jh4b\" (UniqueName: \"kubernetes.io/projected/78347175-f74d-43d6-8cd2-b17fabbd5f27-kube-api-access-2jh4b\") pod \"glanceeb6d-account-delete-87d6c\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.417024 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.417269 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="b9827e91-a646-4485-9117-e72e23035b7c" containerName="openstackclient" containerID="cri-o://17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85" gracePeriod=30 Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.503756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78347175-f74d-43d6-8cd2-b17fabbd5f27-operator-scripts\") pod \"glanceeb6d-account-delete-87d6c\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.504030 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jh4b\" (UniqueName: \"kubernetes.io/projected/78347175-f74d-43d6-8cd2-b17fabbd5f27-kube-api-access-2jh4b\") pod \"glanceeb6d-account-delete-87d6c\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.504743 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78347175-f74d-43d6-8cd2-b17fabbd5f27-operator-scripts\") pod \"glanceeb6d-account-delete-87d6c\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.524358 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jh4b\" (UniqueName: \"kubernetes.io/projected/78347175-f74d-43d6-8cd2-b17fabbd5f27-kube-api-access-2jh4b\") pod \"glanceeb6d-account-delete-87d6c\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " 
pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.648946 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.835484 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.910442 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config\") pod \"b9827e91-a646-4485-9117-e72e23035b7c\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.910636 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4sbp\" (UniqueName: \"kubernetes.io/projected/b9827e91-a646-4485-9117-e72e23035b7c-kube-api-access-j4sbp\") pod \"b9827e91-a646-4485-9117-e72e23035b7c\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.910806 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config-secret\") pod \"b9827e91-a646-4485-9117-e72e23035b7c\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.910862 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-scripts\") pod \"b9827e91-a646-4485-9117-e72e23035b7c\" (UID: \"b9827e91-a646-4485-9117-e72e23035b7c\") " Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.912953 4762 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "b9827e91-a646-4485-9117-e72e23035b7c" (UID: "b9827e91-a646-4485-9117-e72e23035b7c"). InnerVolumeSpecName "openstack-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.918851 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9827e91-a646-4485-9117-e72e23035b7c-kube-api-access-j4sbp" (OuterVolumeSpecName: "kube-api-access-j4sbp") pod "b9827e91-a646-4485-9117-e72e23035b7c" (UID: "b9827e91-a646-4485-9117-e72e23035b7c"). InnerVolumeSpecName "kube-api-access-j4sbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.931334 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b9827e91-a646-4485-9117-e72e23035b7c" (UID: "b9827e91-a646-4485-9117-e72e23035b7c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:52 crc kubenswrapper[4762]: I0217 18:06:52.942122 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b9827e91-a646-4485-9117-e72e23035b7c" (UID: "b9827e91-a646-4485-9117-e72e23035b7c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.012763 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4sbp\" (UniqueName: \"kubernetes.io/projected/b9827e91-a646-4485-9117-e72e23035b7c-kube-api-access-j4sbp\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.012809 4762 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.012825 4762 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.012839 4762 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9827e91-a646-4485-9117-e72e23035b7c-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.052696 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20eb84c0-e935-4918-9ced-8bff7e0a4245" path="/var/lib/kubelet/pods/20eb84c0-e935-4918-9ced-8bff7e0a4245/volumes" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.146298 4762 generic.go:334] "Generic (PLEG): container finished" podID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerID="4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b" exitCode=143 Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.146362 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"47b87e3b-550c-4f8b-ba57-cab19b9b5111","Type":"ContainerDied","Data":"4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b"} Feb 17 
18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.148220 4762 generic.go:334] "Generic (PLEG): container finished" podID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerID="25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea" exitCode=143 Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.148277 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cfeff88c-b91a-496f-a4da-f82c9eb10472","Type":"ContainerDied","Data":"25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea"} Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.149667 4762 generic.go:334] "Generic (PLEG): container finished" podID="b9827e91-a646-4485-9117-e72e23035b7c" containerID="17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85" exitCode=143 Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.149712 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"b9827e91-a646-4485-9117-e72e23035b7c","Type":"ContainerDied","Data":"17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85"} Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.149751 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"b9827e91-a646-4485-9117-e72e23035b7c","Type":"ContainerDied","Data":"b63a5ed1d547fd7dd9af6386e8b7424ae0e90db1481aa17378647623910e06aa"} Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.149769 4762 scope.go:117] "RemoveContainer" containerID="17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.149718 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.168044 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.172528 4762 scope.go:117] "RemoveContainer" containerID="17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85" Feb 17 18:06:53 crc kubenswrapper[4762]: E0217 18:06:53.172906 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85\": container with ID starting with 17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85 not found: ID does not exist" containerID="17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.172945 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85"} err="failed to get container status \"17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85\": rpc error: code = NotFound desc = could not find container \"17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85\": container with ID starting with 17bd1e8dba9f99ad11dd75d2a43955c79eebfa5515f55a16094155ff734f3a85 not found: ID does not exist" Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.179676 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:06:53 crc kubenswrapper[4762]: I0217 18:06:53.198755 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanceeb6d-account-delete-87d6c"] Feb 17 18:06:54 crc kubenswrapper[4762]: I0217 18:06:54.158260 4762 generic.go:334] "Generic (PLEG): container finished" podID="78347175-f74d-43d6-8cd2-b17fabbd5f27" 
containerID="15118eab78bcab3ac2650f43733287003bf13d265d2829a514a170080991b346" exitCode=0 Feb 17 18:06:54 crc kubenswrapper[4762]: I0217 18:06:54.158324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" event={"ID":"78347175-f74d-43d6-8cd2-b17fabbd5f27","Type":"ContainerDied","Data":"15118eab78bcab3ac2650f43733287003bf13d265d2829a514a170080991b346"} Feb 17 18:06:54 crc kubenswrapper[4762]: I0217 18:06:54.158564 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" event={"ID":"78347175-f74d-43d6-8cd2-b17fabbd5f27","Type":"ContainerStarted","Data":"7b8c292a103f237f3a16f5b913e34aa01f1cad1fbbf916a378d18289f9f4996a"} Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.044147 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9827e91-a646-4485-9117-e72e23035b7c" path="/var/lib/kubelet/pods/b9827e91-a646-4485-9117-e72e23035b7c/volumes" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.434427 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.436693 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.103:9292/healthcheck\": read tcp 10.217.0.2:55472->10.217.0.103:9292: read: connection reset by peer" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.436698 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.103:9292/healthcheck\": read tcp 10.217.0.2:55470->10.217.0.103:9292: read: connection reset by peer" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.552807 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jh4b\" (UniqueName: \"kubernetes.io/projected/78347175-f74d-43d6-8cd2-b17fabbd5f27-kube-api-access-2jh4b\") pod \"78347175-f74d-43d6-8cd2-b17fabbd5f27\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.552859 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78347175-f74d-43d6-8cd2-b17fabbd5f27-operator-scripts\") pod \"78347175-f74d-43d6-8cd2-b17fabbd5f27\" (UID: \"78347175-f74d-43d6-8cd2-b17fabbd5f27\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.555458 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78347175-f74d-43d6-8cd2-b17fabbd5f27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78347175-f74d-43d6-8cd2-b17fabbd5f27" (UID: "78347175-f74d-43d6-8cd2-b17fabbd5f27"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.561733 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78347175-f74d-43d6-8cd2-b17fabbd5f27-kube-api-access-2jh4b" (OuterVolumeSpecName: "kube-api-access-2jh4b") pod "78347175-f74d-43d6-8cd2-b17fabbd5f27" (UID: "78347175-f74d-43d6-8cd2-b17fabbd5f27"). InnerVolumeSpecName "kube-api-access-2jh4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.654815 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jh4b\" (UniqueName: \"kubernetes.io/projected/78347175-f74d-43d6-8cd2-b17fabbd5f27-kube-api-access-2jh4b\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.654850 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78347175-f74d-43d6-8cd2-b17fabbd5f27-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.779273 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.781690 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858217 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxfl\" (UniqueName: \"kubernetes.io/projected/47b87e3b-550c-4f8b-ba57-cab19b9b5111-kube-api-access-djxfl\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858259 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-sys\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858289 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-scripts\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858310 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-iscsi\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858338 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-logs\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858355 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-sys" 
(OuterVolumeSpecName: "sys") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858391 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-nvme\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858414 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-dev\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858411 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858457 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-run\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858475 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-lib-modules\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858455 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858503 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-var-locks-brick\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858534 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-logs\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858472 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-dev" (OuterVolumeSpecName: "dev") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858492 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-run" (OuterVolumeSpecName: "run") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858518 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858535 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858559 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858577 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-lib-modules\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858594 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858640 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-config-data\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858667 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-dev\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858688 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-scripts\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858712 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858737 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858763 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-httpd-run\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858795 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-iscsi\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858815 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-var-locks-brick\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858840 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-sys\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858863 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjk9b\" (UniqueName: \"kubernetes.io/projected/cfeff88c-b91a-496f-a4da-f82c9eb10472-kube-api-access-mjk9b\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858882 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-httpd-run\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858892 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-logs" (OuterVolumeSpecName: "logs") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858903 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-run\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858922 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-nvme\") pod \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\" (UID: \"47b87e3b-550c-4f8b-ba57-cab19b9b5111\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858954 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-config-data\") pod \"cfeff88c-b91a-496f-a4da-f82c9eb10472\" (UID: \"cfeff88c-b91a-496f-a4da-f82c9eb10472\") " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858948 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-logs" (OuterVolumeSpecName: "logs") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.858995 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-sys" (OuterVolumeSpecName: "sys") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859238 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859268 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859266 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859306 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-dev" (OuterVolumeSpecName: "dev") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859295 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859350 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859396 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859424 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859436 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859477 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859480 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-run" (OuterVolumeSpecName: "run") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859492 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859504 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859514 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859527 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859690 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859713 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.859747 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.861553 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b87e3b-550c-4f8b-ba57-cab19b9b5111-kube-api-access-djxfl" (OuterVolumeSpecName: "kube-api-access-djxfl") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "kube-api-access-djxfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.862067 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfeff88c-b91a-496f-a4da-f82c9eb10472-kube-api-access-mjk9b" (OuterVolumeSpecName: "kube-api-access-mjk9b") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "kube-api-access-mjk9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.862907 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.863020 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.864353 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.864603 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.864609 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-scripts" (OuterVolumeSpecName: "scripts") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.872576 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-scripts" (OuterVolumeSpecName: "scripts") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.897347 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-config-data" (OuterVolumeSpecName: "config-data") pod "47b87e3b-550c-4f8b-ba57-cab19b9b5111" (UID: "47b87e3b-550c-4f8b-ba57-cab19b9b5111"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.904872 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-config-data" (OuterVolumeSpecName: "config-data") pod "cfeff88c-b91a-496f-a4da-f82c9eb10472" (UID: "cfeff88c-b91a-496f-a4da-f82c9eb10472"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961193 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961226 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961243 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961255 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961268 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961278 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961295 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961309 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961321 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47b87e3b-550c-4f8b-ba57-cab19b9b5111-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961332 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961343 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961357 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjk9b\" (UniqueName: \"kubernetes.io/projected/cfeff88c-b91a-496f-a4da-f82c9eb10472-kube-api-access-mjk9b\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961368 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfeff88c-b91a-496f-a4da-f82c9eb10472-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961379 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfeff88c-b91a-496f-a4da-f82c9eb10472-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961388 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47b87e3b-550c-4f8b-ba57-cab19b9b5111-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961398 4762 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfeff88c-b91a-496f-a4da-f82c9eb10472-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961409 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxfl\" (UniqueName: \"kubernetes.io/projected/47b87e3b-550c-4f8b-ba57-cab19b9b5111-kube-api-access-djxfl\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.961418 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b87e3b-550c-4f8b-ba57-cab19b9b5111-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.973300 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.973391 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.975974 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:06:55 crc kubenswrapper[4762]: I0217 18:06:55.976405 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.062957 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.062990 4762 reconciler_common.go:293] "Volume detached 
for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.063001 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.063008 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.172990 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" event={"ID":"78347175-f74d-43d6-8cd2-b17fabbd5f27","Type":"ContainerDied","Data":"7b8c292a103f237f3a16f5b913e34aa01f1cad1fbbf916a378d18289f9f4996a"} Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.173039 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b8c292a103f237f3a16f5b913e34aa01f1cad1fbbf916a378d18289f9f4996a" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.173039 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanceeb6d-account-delete-87d6c" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.175810 4762 generic.go:334] "Generic (PLEG): container finished" podID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerID="6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9" exitCode=0 Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.175847 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.175888 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"47b87e3b-550c-4f8b-ba57-cab19b9b5111","Type":"ContainerDied","Data":"6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9"} Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.175945 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"47b87e3b-550c-4f8b-ba57-cab19b9b5111","Type":"ContainerDied","Data":"606c3e8dd073dc0ed6636a1f65d7c09dedb0bb705624c6332e1870554029abb3"} Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.175978 4762 scope.go:117] "RemoveContainer" containerID="6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.178811 4762 generic.go:334] "Generic (PLEG): container finished" podID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerID="421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde" exitCode=0 Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.178849 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cfeff88c-b91a-496f-a4da-f82c9eb10472","Type":"ContainerDied","Data":"421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde"} Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.178868 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.178880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cfeff88c-b91a-496f-a4da-f82c9eb10472","Type":"ContainerDied","Data":"67d8e2e96f418b175a0cb62fc3605285d30ef1862f982e8a02a13e6c86cf2625"} Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.197122 4762 scope.go:117] "RemoveContainer" containerID="4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.214104 4762 scope.go:117] "RemoveContainer" containerID="6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9" Feb 17 18:06:56 crc kubenswrapper[4762]: E0217 18:06:56.214558 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9\": container with ID starting with 6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9 not found: ID does not exist" containerID="6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.214594 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9"} err="failed to get container status \"6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9\": rpc error: code = NotFound desc = could not find container \"6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9\": container with ID starting with 6a43b34440ff5693f5c2d7034999a3904845c63291cd098c40fd593cfd860ba9 not found: ID does not exist" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.214639 4762 scope.go:117] "RemoveContainer" containerID="4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b" Feb 17 
18:06:56 crc kubenswrapper[4762]: E0217 18:06:56.215421 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b\": container with ID starting with 4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b not found: ID does not exist" containerID="4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.215483 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b"} err="failed to get container status \"4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b\": rpc error: code = NotFound desc = could not find container \"4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b\": container with ID starting with 4cee263b45955814b4a5bff7e225e190422a2dbe584fffe5587d8a6cf1808f2b not found: ID does not exist" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.215512 4762 scope.go:117] "RemoveContainer" containerID="421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.216212 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.230842 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.243754 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.244719 4762 scope.go:117] "RemoveContainer" containerID="25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.249759 4762 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.258646 4762 scope.go:117] "RemoveContainer" containerID="421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde" Feb 17 18:06:56 crc kubenswrapper[4762]: E0217 18:06:56.259020 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde\": container with ID starting with 421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde not found: ID does not exist" containerID="421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.259052 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde"} err="failed to get container status \"421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde\": rpc error: code = NotFound desc = could not find container \"421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde\": container with ID starting with 421dd75fab2a739c16fe7090054484eb44fc86ea392c5a0ea266fcbfe1b06bde not found: ID does not exist" Feb 17 18:06:56 crc kubenswrapper[4762]: I0217 18:06:56.259076 4762 scope.go:117] "RemoveContainer" containerID="25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea" Feb 17 18:06:56 crc kubenswrapper[4762]: E0217 18:06:56.259330 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea\": container with ID starting with 25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea not found: ID does not exist" containerID="25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea" Feb 17 18:06:56 crc 
kubenswrapper[4762]: I0217 18:06:56.259373 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea"} err="failed to get container status \"25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea\": rpc error: code = NotFound desc = could not find container \"25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea\": container with ID starting with 25f413e8de3d4c7e220e3ffe4ae64e3f273b543a3a26d529f230af15b9ab07ea not found: ID does not exist" Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.046713 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" path="/var/lib/kubelet/pods/47b87e3b-550c-4f8b-ba57-cab19b9b5111/volumes" Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.047924 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" path="/var/lib/kubelet/pods/cfeff88c-b91a-496f-a4da-f82c9eb10472/volumes" Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.328887 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-zd56k"] Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.335886 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-zd56k"] Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.342251 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glanceeb6d-account-delete-87d6c"] Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.348754 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd"] Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 18:06:57.354810 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-eb6d-account-create-update-ltlzd"] Feb 17 18:06:57 crc kubenswrapper[4762]: I0217 
18:06:57.360794 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glanceeb6d-account-delete-87d6c"] Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.281599 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-c2jmn"] Feb 17 18:06:58 crc kubenswrapper[4762]: E0217 18:06:58.281900 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-httpd" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.281919 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-httpd" Feb 17 18:06:58 crc kubenswrapper[4762]: E0217 18:06:58.281935 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-log" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.281942 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-log" Feb 17 18:06:58 crc kubenswrapper[4762]: E0217 18:06:58.281954 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-httpd" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.281960 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-httpd" Feb 17 18:06:58 crc kubenswrapper[4762]: E0217 18:06:58.281971 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-log" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.281976 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-log" Feb 17 18:06:58 crc kubenswrapper[4762]: E0217 18:06:58.281988 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78347175-f74d-43d6-8cd2-b17fabbd5f27" containerName="mariadb-account-delete" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.281994 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78347175-f74d-43d6-8cd2-b17fabbd5f27" containerName="mariadb-account-delete" Feb 17 18:06:58 crc kubenswrapper[4762]: E0217 18:06:58.282003 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9827e91-a646-4485-9117-e72e23035b7c" containerName="openstackclient" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282008 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9827e91-a646-4485-9117-e72e23035b7c" containerName="openstackclient" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282127 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-log" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282138 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfeff88c-b91a-496f-a4da-f82c9eb10472" containerName="glance-httpd" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282147 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78347175-f74d-43d6-8cd2-b17fabbd5f27" containerName="mariadb-account-delete" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282159 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-log" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282168 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9827e91-a646-4485-9117-e72e23035b7c" containerName="openstackclient" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282176 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b87e3b-550c-4f8b-ba57-cab19b9b5111" containerName="glance-httpd" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.282596 4762 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.294647 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-5d57-account-create-update-wh7zx"] Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.296178 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.297591 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.302085 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-c2jmn"] Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.309340 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5d57-account-create-update-wh7zx"] Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.393219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv7dr\" (UniqueName: \"kubernetes.io/projected/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-kube-api-access-wv7dr\") pod \"glance-5d57-account-create-update-wh7zx\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.393314 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8dh\" (UniqueName: \"kubernetes.io/projected/2f20aed1-a1f6-472c-b804-24e295bc6e18-kube-api-access-zr8dh\") pod \"glance-db-create-c2jmn\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.393385 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-operator-scripts\") pod \"glance-5d57-account-create-update-wh7zx\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.393408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f20aed1-a1f6-472c-b804-24e295bc6e18-operator-scripts\") pod \"glance-db-create-c2jmn\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.495192 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-operator-scripts\") pod \"glance-5d57-account-create-update-wh7zx\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.495257 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f20aed1-a1f6-472c-b804-24e295bc6e18-operator-scripts\") pod \"glance-db-create-c2jmn\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.495344 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv7dr\" (UniqueName: \"kubernetes.io/projected/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-kube-api-access-wv7dr\") pod \"glance-5d57-account-create-update-wh7zx\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " 
pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.495380 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8dh\" (UniqueName: \"kubernetes.io/projected/2f20aed1-a1f6-472c-b804-24e295bc6e18-kube-api-access-zr8dh\") pod \"glance-db-create-c2jmn\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.496361 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f20aed1-a1f6-472c-b804-24e295bc6e18-operator-scripts\") pod \"glance-db-create-c2jmn\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.496361 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-operator-scripts\") pod \"glance-5d57-account-create-update-wh7zx\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.515676 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv7dr\" (UniqueName: \"kubernetes.io/projected/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-kube-api-access-wv7dr\") pod \"glance-5d57-account-create-update-wh7zx\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.515741 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8dh\" (UniqueName: \"kubernetes.io/projected/2f20aed1-a1f6-472c-b804-24e295bc6e18-kube-api-access-zr8dh\") pod \"glance-db-create-c2jmn\" (UID: 
\"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.599949 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:06:58 crc kubenswrapper[4762]: I0217 18:06:58.625087 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.020335 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-c2jmn"] Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.044536 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c1aaaf-96e7-4356-9107-7adcd9cad2df" path="/var/lib/kubelet/pods/39c1aaaf-96e7-4356-9107-7adcd9cad2df/volumes" Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.045516 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78347175-f74d-43d6-8cd2-b17fabbd5f27" path="/var/lib/kubelet/pods/78347175-f74d-43d6-8cd2-b17fabbd5f27/volumes" Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.046064 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0a0e07-c833-44d2-bb21-7ff75db80be1" path="/var/lib/kubelet/pods/ed0a0e07-c833-44d2-bb21-7ff75db80be1/volumes" Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.101895 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5d57-account-create-update-wh7zx"] Feb 17 18:06:59 crc kubenswrapper[4762]: W0217 18:06:59.117139 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ba5ea58_4b97_474b_b895_df3c1ee3a02e.slice/crio-51658efdece78b840186aa48ae86041a40c099f731b4dca5cd95131d717d2017 WatchSource:0}: Error finding container 
51658efdece78b840186aa48ae86041a40c099f731b4dca5cd95131d717d2017: Status 404 returned error can't find the container with id 51658efdece78b840186aa48ae86041a40c099f731b4dca5cd95131d717d2017 Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.207771 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" event={"ID":"4ba5ea58-4b97-474b-b895-df3c1ee3a02e","Type":"ContainerStarted","Data":"51658efdece78b840186aa48ae86041a40c099f731b4dca5cd95131d717d2017"} Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.208908 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-c2jmn" event={"ID":"2f20aed1-a1f6-472c-b804-24e295bc6e18","Type":"ContainerStarted","Data":"1934288ccee311898ca6cad67c02aabfbdb12d8996abd531912d73a8b49217d0"} Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.208931 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-c2jmn" event={"ID":"2f20aed1-a1f6-472c-b804-24e295bc6e18","Type":"ContainerStarted","Data":"44099c0d21c3e2badf5ff6667e3f887c3918eadaba40880ca614211603bb2b0b"} Feb 17 18:06:59 crc kubenswrapper[4762]: I0217 18:06:59.247037 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-c2jmn" podStartSLOduration=1.247017059 podStartE2EDuration="1.247017059s" podCreationTimestamp="2026-02-17 18:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:59.241505863 +0000 UTC m=+1170.886423873" watchObservedRunningTime="2026-02-17 18:06:59.247017059 +0000 UTC m=+1170.891935069" Feb 17 18:07:00 crc kubenswrapper[4762]: I0217 18:07:00.217456 4762 generic.go:334] "Generic (PLEG): container finished" podID="2f20aed1-a1f6-472c-b804-24e295bc6e18" containerID="1934288ccee311898ca6cad67c02aabfbdb12d8996abd531912d73a8b49217d0" exitCode=0 Feb 17 
18:07:00 crc kubenswrapper[4762]: I0217 18:07:00.217533 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-c2jmn" event={"ID":"2f20aed1-a1f6-472c-b804-24e295bc6e18","Type":"ContainerDied","Data":"1934288ccee311898ca6cad67c02aabfbdb12d8996abd531912d73a8b49217d0"} Feb 17 18:07:00 crc kubenswrapper[4762]: I0217 18:07:00.220976 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ba5ea58-4b97-474b-b895-df3c1ee3a02e" containerID="4d85cb21881277c951bf6629c2b4569b90ea2f745c45c732745891e793fbbef5" exitCode=0 Feb 17 18:07:00 crc kubenswrapper[4762]: I0217 18:07:00.221025 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" event={"ID":"4ba5ea58-4b97-474b-b895-df3c1ee3a02e","Type":"ContainerDied","Data":"4d85cb21881277c951bf6629c2b4569b90ea2f745c45c732745891e793fbbef5"} Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.541057 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.550406 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.642815 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-operator-scripts\") pod \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.642885 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr8dh\" (UniqueName: \"kubernetes.io/projected/2f20aed1-a1f6-472c-b804-24e295bc6e18-kube-api-access-zr8dh\") pod \"2f20aed1-a1f6-472c-b804-24e295bc6e18\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.642912 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f20aed1-a1f6-472c-b804-24e295bc6e18-operator-scripts\") pod \"2f20aed1-a1f6-472c-b804-24e295bc6e18\" (UID: \"2f20aed1-a1f6-472c-b804-24e295bc6e18\") " Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.643053 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv7dr\" (UniqueName: \"kubernetes.io/projected/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-kube-api-access-wv7dr\") pod \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\" (UID: \"4ba5ea58-4b97-474b-b895-df3c1ee3a02e\") " Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.643392 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ba5ea58-4b97-474b-b895-df3c1ee3a02e" (UID: "4ba5ea58-4b97-474b-b895-df3c1ee3a02e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.643425 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f20aed1-a1f6-472c-b804-24e295bc6e18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f20aed1-a1f6-472c-b804-24e295bc6e18" (UID: "2f20aed1-a1f6-472c-b804-24e295bc6e18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.648804 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-kube-api-access-wv7dr" (OuterVolumeSpecName: "kube-api-access-wv7dr") pod "4ba5ea58-4b97-474b-b895-df3c1ee3a02e" (UID: "4ba5ea58-4b97-474b-b895-df3c1ee3a02e"). InnerVolumeSpecName "kube-api-access-wv7dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.648842 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f20aed1-a1f6-472c-b804-24e295bc6e18-kube-api-access-zr8dh" (OuterVolumeSpecName: "kube-api-access-zr8dh") pod "2f20aed1-a1f6-472c-b804-24e295bc6e18" (UID: "2f20aed1-a1f6-472c-b804-24e295bc6e18"). InnerVolumeSpecName "kube-api-access-zr8dh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.744519 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv7dr\" (UniqueName: \"kubernetes.io/projected/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-kube-api-access-wv7dr\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.744557 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba5ea58-4b97-474b-b895-df3c1ee3a02e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.744568 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr8dh\" (UniqueName: \"kubernetes.io/projected/2f20aed1-a1f6-472c-b804-24e295bc6e18-kube-api-access-zr8dh\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:01 crc kubenswrapper[4762]: I0217 18:07:01.744576 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f20aed1-a1f6-472c-b804-24e295bc6e18-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:02 crc kubenswrapper[4762]: I0217 18:07:02.234746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" event={"ID":"4ba5ea58-4b97-474b-b895-df3c1ee3a02e","Type":"ContainerDied","Data":"51658efdece78b840186aa48ae86041a40c099f731b4dca5cd95131d717d2017"} Feb 17 18:07:02 crc kubenswrapper[4762]: I0217 18:07:02.234784 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51658efdece78b840186aa48ae86041a40c099f731b4dca5cd95131d717d2017" Feb 17 18:07:02 crc kubenswrapper[4762]: I0217 18:07:02.234797 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5d57-account-create-update-wh7zx" Feb 17 18:07:02 crc kubenswrapper[4762]: I0217 18:07:02.236574 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-c2jmn" event={"ID":"2f20aed1-a1f6-472c-b804-24e295bc6e18","Type":"ContainerDied","Data":"44099c0d21c3e2badf5ff6667e3f887c3918eadaba40880ca614211603bb2b0b"} Feb 17 18:07:02 crc kubenswrapper[4762]: I0217 18:07:02.236638 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44099c0d21c3e2badf5ff6667e3f887c3918eadaba40880ca614211603bb2b0b" Feb 17 18:07:02 crc kubenswrapper[4762]: I0217 18:07:02.236659 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-c2jmn" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.514388 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-r29k6"] Feb 17 18:07:03 crc kubenswrapper[4762]: E0217 18:07:03.516143 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba5ea58-4b97-474b-b895-df3c1ee3a02e" containerName="mariadb-account-create-update" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.516240 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba5ea58-4b97-474b-b895-df3c1ee3a02e" containerName="mariadb-account-create-update" Feb 17 18:07:03 crc kubenswrapper[4762]: E0217 18:07:03.516325 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f20aed1-a1f6-472c-b804-24e295bc6e18" containerName="mariadb-database-create" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.516391 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f20aed1-a1f6-472c-b804-24e295bc6e18" containerName="mariadb-database-create" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.516872 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f20aed1-a1f6-472c-b804-24e295bc6e18" 
containerName="mariadb-database-create" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.516930 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba5ea58-4b97-474b-b895-df3c1ee3a02e" containerName="mariadb-account-create-update" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.518193 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.520501 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.520693 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.521292 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-ht24h" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.547318 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-r29k6"] Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.571091 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-db-sync-config-data\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.571192 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wjxp\" (UniqueName: \"kubernetes.io/projected/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-kube-api-access-8wjxp\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: 
I0217 18:07:03.571311 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-combined-ca-bundle\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.571358 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-config-data\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.672784 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-db-sync-config-data\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.672861 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wjxp\" (UniqueName: \"kubernetes.io/projected/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-kube-api-access-8wjxp\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.672975 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-combined-ca-bundle\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.673011 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-config-data\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.680288 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-combined-ca-bundle\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.680337 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-db-sync-config-data\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.680603 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-config-data\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.693301 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wjxp\" (UniqueName: \"kubernetes.io/projected/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-kube-api-access-8wjxp\") pod \"glance-db-sync-r29k6\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:03 crc kubenswrapper[4762]: I0217 18:07:03.842859 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:04 crc kubenswrapper[4762]: I0217 18:07:04.296024 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-r29k6"] Feb 17 18:07:04 crc kubenswrapper[4762]: I0217 18:07:04.558966 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:07:04 crc kubenswrapper[4762]: I0217 18:07:04.559039 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:07:04 crc kubenswrapper[4762]: I0217 18:07:04.559255 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 18:07:04 crc kubenswrapper[4762]: I0217 18:07:04.560020 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7383a3a662a9b124ecf96d7abf64c6e25de420f4076f78c28ca4eeb9a1cb55f6"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:07:04 crc kubenswrapper[4762]: I0217 18:07:04.560094 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" 
containerID="cri-o://7383a3a662a9b124ecf96d7abf64c6e25de420f4076f78c28ca4eeb9a1cb55f6" gracePeriod=600 Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.260518 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="7383a3a662a9b124ecf96d7abf64c6e25de420f4076f78c28ca4eeb9a1cb55f6" exitCode=0 Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.261150 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"7383a3a662a9b124ecf96d7abf64c6e25de420f4076f78c28ca4eeb9a1cb55f6"} Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.261193 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"dcb7e7b99c1665f4d4f459fb3d5e0f62dcd0b605d5942c6bcbc73ce48dfe3885"} Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.261217 4762 scope.go:117] "RemoveContainer" containerID="f6d7169d5319fd48ce328413c1944d85701526c1b8e50744c099c2e1b3abb5de" Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.262974 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-r29k6" event={"ID":"65631b37-4a99-4f5a-a9c7-26c271e0a1c3","Type":"ContainerStarted","Data":"5141b833e80fc94fe26eca5a9a02fe35e13356b4349a60200d849800087da384"} Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.263012 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-r29k6" event={"ID":"65631b37-4a99-4f5a-a9c7-26c271e0a1c3","Type":"ContainerStarted","Data":"13c89130972daa3acd6d754bd447c3fbc3074504497b5257864f81c940ddfe4c"} Feb 17 18:07:05 crc kubenswrapper[4762]: I0217 18:07:05.299751 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-db-sync-r29k6" podStartSLOduration=2.299704624 podStartE2EDuration="2.299704624s" podCreationTimestamp="2026-02-17 18:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:07:05.298143219 +0000 UTC m=+1176.943061249" watchObservedRunningTime="2026-02-17 18:07:05.299704624 +0000 UTC m=+1176.944622634" Feb 17 18:07:08 crc kubenswrapper[4762]: I0217 18:07:08.292437 4762 generic.go:334] "Generic (PLEG): container finished" podID="65631b37-4a99-4f5a-a9c7-26c271e0a1c3" containerID="5141b833e80fc94fe26eca5a9a02fe35e13356b4349a60200d849800087da384" exitCode=0 Feb 17 18:07:08 crc kubenswrapper[4762]: I0217 18:07:08.292561 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-r29k6" event={"ID":"65631b37-4a99-4f5a-a9c7-26c271e0a1c3","Type":"ContainerDied","Data":"5141b833e80fc94fe26eca5a9a02fe35e13356b4349a60200d849800087da384"} Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.552757 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.693595 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-db-sync-config-data\") pod \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.693702 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wjxp\" (UniqueName: \"kubernetes.io/projected/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-kube-api-access-8wjxp\") pod \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.693740 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-config-data\") pod \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.693795 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-combined-ca-bundle\") pod \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\" (UID: \"65631b37-4a99-4f5a-a9c7-26c271e0a1c3\") " Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.700211 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-kube-api-access-8wjxp" (OuterVolumeSpecName: "kube-api-access-8wjxp") pod "65631b37-4a99-4f5a-a9c7-26c271e0a1c3" (UID: "65631b37-4a99-4f5a-a9c7-26c271e0a1c3"). InnerVolumeSpecName "kube-api-access-8wjxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.701065 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "65631b37-4a99-4f5a-a9c7-26c271e0a1c3" (UID: "65631b37-4a99-4f5a-a9c7-26c271e0a1c3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.725151 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65631b37-4a99-4f5a-a9c7-26c271e0a1c3" (UID: "65631b37-4a99-4f5a-a9c7-26c271e0a1c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.735773 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-config-data" (OuterVolumeSpecName: "config-data") pod "65631b37-4a99-4f5a-a9c7-26c271e0a1c3" (UID: "65631b37-4a99-4f5a-a9c7-26c271e0a1c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.796105 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wjxp\" (UniqueName: \"kubernetes.io/projected/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-kube-api-access-8wjxp\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.796155 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.796168 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:09 crc kubenswrapper[4762]: I0217 18:07:09.796180 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65631b37-4a99-4f5a-a9c7-26c271e0a1c3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:10 crc kubenswrapper[4762]: I0217 18:07:10.307597 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-r29k6" event={"ID":"65631b37-4a99-4f5a-a9c7-26c271e0a1c3","Type":"ContainerDied","Data":"13c89130972daa3acd6d754bd447c3fbc3074504497b5257864f81c940ddfe4c"} Feb 17 18:07:10 crc kubenswrapper[4762]: I0217 18:07:10.307663 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13c89130972daa3acd6d754bd447c3fbc3074504497b5257864f81c940ddfe4c" Feb 17 18:07:10 crc kubenswrapper[4762]: I0217 18:07:10.307663 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-r29k6" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.609998 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:11 crc kubenswrapper[4762]: E0217 18:07:11.610582 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65631b37-4a99-4f5a-a9c7-26c271e0a1c3" containerName="glance-db-sync" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.610594 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="65631b37-4a99-4f5a-a9c7-26c271e0a1c3" containerName="glance-db-sync" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.610745 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="65631b37-4a99-4f5a-a9c7-26c271e0a1c3" containerName="glance-db-sync" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.611410 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.614462 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.615725 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.616299 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.616546 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.616736 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-ht24h" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.616883 4762 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.624495 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.779405 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:11 crc kubenswrapper[4762]: E0217 18:07:11.779958 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-msr9r logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="glance-kuttl-tests/glance-default-single-0" podUID="928a64dc-51e1-47bb-9087-26b7b3316f1c" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808123 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808243 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msr9r\" (UniqueName: \"kubernetes.io/projected/928a64dc-51e1-47bb-9087-26b7b3316f1c-kube-api-access-msr9r\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808289 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-scripts\") pod \"glance-default-single-0\" (UID: 
\"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808326 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808357 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-httpd-run\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808509 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808595 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808643 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-config-data\") pod 
\"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.808689 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-logs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909668 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msr9r\" (UniqueName: \"kubernetes.io/projected/928a64dc-51e1-47bb-9087-26b7b3316f1c-kube-api-access-msr9r\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-scripts\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " 
pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909821 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-httpd-run\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909855 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-config-data\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.909946 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-logs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: 
I0217 18:07:11.910798 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.910986 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-logs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.911031 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-httpd-run\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.915005 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-scripts\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.915122 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.915562 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.916273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.929160 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.932787 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-config-data\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:11 crc kubenswrapper[4762]: I0217 18:07:11.943193 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msr9r\" (UniqueName: \"kubernetes.io/projected/928a64dc-51e1-47bb-9087-26b7b3316f1c-kube-api-access-msr9r\") pod \"glance-default-single-0\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.440089 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.449966 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623189 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-logs\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623241 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-httpd-run\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623291 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-internal-tls-certs\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623327 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-public-tls-certs\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623350 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msr9r\" (UniqueName: \"kubernetes.io/projected/928a64dc-51e1-47bb-9087-26b7b3316f1c-kube-api-access-msr9r\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: 
\"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623371 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-combined-ca-bundle\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623394 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623428 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-config-data\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623455 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-scripts\") pod \"928a64dc-51e1-47bb-9087-26b7b3316f1c\" (UID: \"928a64dc-51e1-47bb-9087-26b7b3316f1c\") " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623742 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.623774 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-logs" (OuterVolumeSpecName: "logs") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.628454 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.628801 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-config-data" (OuterVolumeSpecName: "config-data") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.629256 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.630022 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.633236 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928a64dc-51e1-47bb-9087-26b7b3316f1c-kube-api-access-msr9r" (OuterVolumeSpecName: "kube-api-access-msr9r") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "kube-api-access-msr9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.633304 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.633372 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-scripts" (OuterVolumeSpecName: "scripts") pod "928a64dc-51e1-47bb-9087-26b7b3316f1c" (UID: "928a64dc-51e1-47bb-9087-26b7b3316f1c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724823 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724857 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724866 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724875 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/928a64dc-51e1-47bb-9087-26b7b3316f1c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724887 4762 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724901 4762 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724912 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msr9r\" (UniqueName: \"kubernetes.io/projected/928a64dc-51e1-47bb-9087-26b7b3316f1c-kube-api-access-msr9r\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724920 4762 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928a64dc-51e1-47bb-9087-26b7b3316f1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.724941 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.737472 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:07:12 crc kubenswrapper[4762]: I0217 18:07:12.826536 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.445892 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.521686 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.523048 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.563522 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.564843 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.567383 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.567496 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.567383 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.568932 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.569374 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-ht24h" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.569447 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.636448 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.738919 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.738966 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-logs\") pod 
\"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.738992 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-scripts\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.739021 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-httpd-run\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.739077 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.739110 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhkz\" (UniqueName: \"kubernetes.io/projected/a941e309-e15c-4890-abd6-7a44861cbbe9-kube-api-access-9fhkz\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.739184 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.739215 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.739329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-config-data\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840238 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-scripts\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840313 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-httpd-run\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840342 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840370 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhkz\" (UniqueName: \"kubernetes.io/projected/a941e309-e15c-4890-abd6-7a44861cbbe9-kube-api-access-9fhkz\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840430 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840452 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840486 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-config-data\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840518 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840538 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-logs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.840803 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.841042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-httpd-run\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.841058 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-logs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.846484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-public-tls-certs\") pod \"glance-default-single-0\" (UID: 
\"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.846505 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.846878 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-config-data\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.853192 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.853406 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-scripts\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.858577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhkz\" (UniqueName: \"kubernetes.io/projected/a941e309-e15c-4890-abd6-7a44861cbbe9-kube-api-access-9fhkz\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 
18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.859943 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:13 crc kubenswrapper[4762]: I0217 18:07:13.879675 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:14 crc kubenswrapper[4762]: I0217 18:07:14.281846 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:14 crc kubenswrapper[4762]: I0217 18:07:14.453086 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a941e309-e15c-4890-abd6-7a44861cbbe9","Type":"ContainerStarted","Data":"fba17e53140c1bb8dc73fd5b3959b7b35e4729785663dbde75679372bb81f26c"} Feb 17 18:07:15 crc kubenswrapper[4762]: I0217 18:07:15.045601 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928a64dc-51e1-47bb-9087-26b7b3316f1c" path="/var/lib/kubelet/pods/928a64dc-51e1-47bb-9087-26b7b3316f1c/volumes" Feb 17 18:07:15 crc kubenswrapper[4762]: I0217 18:07:15.462875 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a941e309-e15c-4890-abd6-7a44861cbbe9","Type":"ContainerStarted","Data":"18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478"} Feb 17 18:07:15 crc kubenswrapper[4762]: I0217 18:07:15.463219 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a941e309-e15c-4890-abd6-7a44861cbbe9","Type":"ContainerStarted","Data":"e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec"} Feb 17 18:07:15 crc kubenswrapper[4762]: I0217 18:07:15.487612 4762 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.487590207 podStartE2EDuration="2.487590207s" podCreationTimestamp="2026-02-17 18:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:07:15.484613262 +0000 UTC m=+1187.129531272" watchObservedRunningTime="2026-02-17 18:07:15.487590207 +0000 UTC m=+1187.132508217" Feb 17 18:07:23 crc kubenswrapper[4762]: I0217 18:07:23.880982 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:23 crc kubenswrapper[4762]: I0217 18:07:23.884685 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:23 crc kubenswrapper[4762]: I0217 18:07:23.907688 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:23 crc kubenswrapper[4762]: I0217 18:07:23.920718 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:24 crc kubenswrapper[4762]: I0217 18:07:24.533867 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:24 crc kubenswrapper[4762]: I0217 18:07:24.533909 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:26 crc kubenswrapper[4762]: I0217 18:07:26.483079 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:26 crc kubenswrapper[4762]: I0217 18:07:26.544798 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:07:26 crc kubenswrapper[4762]: I0217 
18:07:26.589653 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.301522 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-r29k6"] Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.307256 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-r29k6"] Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.356298 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance5d57-account-delete-4xk79"] Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.357910 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.363382 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5d57-account-delete-4xk79"] Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.385120 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.454821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4sw\" (UniqueName: \"kubernetes.io/projected/19a75058-a99a-4f37-a8bd-237c0d82cbc1-kube-api-access-qg4sw\") pod \"glance5d57-account-delete-4xk79\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.454883 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a75058-a99a-4f37-a8bd-237c0d82cbc1-operator-scripts\") pod \"glance5d57-account-delete-4xk79\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " 
pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.550508 4762 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/glance-default-single-0" secret="" err="secret \"glance-glance-dockercfg-ht24h\" not found" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.556719 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4sw\" (UniqueName: \"kubernetes.io/projected/19a75058-a99a-4f37-a8bd-237c0d82cbc1-kube-api-access-qg4sw\") pod \"glance5d57-account-delete-4xk79\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.556768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a75058-a99a-4f37-a8bd-237c0d82cbc1-operator-scripts\") pod \"glance5d57-account-delete-4xk79\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.557782 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a75058-a99a-4f37-a8bd-237c0d82cbc1-operator-scripts\") pod \"glance5d57-account-delete-4xk79\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.583769 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4sw\" (UniqueName: \"kubernetes.io/projected/19a75058-a99a-4f37-a8bd-237c0d82cbc1-kube-api-access-qg4sw\") pod \"glance5d57-account-delete-4xk79\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:27 crc 
kubenswrapper[4762]: E0217 18:07:27.592138 4762 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC glance-kuttl-tests/glance-glance-default-single-0: PVC is being deleted" pod="glance-kuttl-tests/glance-default-single-0" volumeName="glance" Feb 17 18:07:27 crc kubenswrapper[4762]: I0217 18:07:27.680156 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:28 crc kubenswrapper[4762]: I0217 18:07:28.082312 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5d57-account-delete-4xk79"] Feb 17 18:07:28 crc kubenswrapper[4762]: I0217 18:07:28.559236 4762 generic.go:334] "Generic (PLEG): container finished" podID="19a75058-a99a-4f37-a8bd-237c0d82cbc1" containerID="9fa3f922e6b52efd7355b0422bd544df3d08e7f319a51a832f1515640f925378" exitCode=0 Feb 17 18:07:28 crc kubenswrapper[4762]: I0217 18:07:28.559294 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" event={"ID":"19a75058-a99a-4f37-a8bd-237c0d82cbc1","Type":"ContainerDied","Data":"9fa3f922e6b52efd7355b0422bd544df3d08e7f319a51a832f1515640f925378"} Feb 17 18:07:28 crc kubenswrapper[4762]: I0217 18:07:28.559585 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" event={"ID":"19a75058-a99a-4f37-a8bd-237c0d82cbc1","Type":"ContainerStarted","Data":"431b467e3ba5e23b2bd84ef6c8ccca8afc5cd59f5333c478d53f5ac96a653ebf"} Feb 17 18:07:28 crc kubenswrapper[4762]: I0217 18:07:28.559791 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-log" containerID="cri-o://e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec" gracePeriod=30 Feb 17 18:07:28 crc kubenswrapper[4762]: I0217 18:07:28.559834 4762 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-httpd" containerID="cri-o://18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478" gracePeriod=30 Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.046229 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65631b37-4a99-4f5a-a9c7-26c271e0a1c3" path="/var/lib/kubelet/pods/65631b37-4a99-4f5a-a9c7-26c271e0a1c3/volumes" Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.567923 4762 generic.go:334] "Generic (PLEG): container finished" podID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerID="e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec" exitCode=143 Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.568032 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a941e309-e15c-4890-abd6-7a44861cbbe9","Type":"ContainerDied","Data":"e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec"} Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.851397 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.994321 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4sw\" (UniqueName: \"kubernetes.io/projected/19a75058-a99a-4f37-a8bd-237c0d82cbc1-kube-api-access-qg4sw\") pod \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.994430 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a75058-a99a-4f37-a8bd-237c0d82cbc1-operator-scripts\") pod \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\" (UID: \"19a75058-a99a-4f37-a8bd-237c0d82cbc1\") " Feb 17 18:07:29 crc kubenswrapper[4762]: I0217 18:07:29.995175 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a75058-a99a-4f37-a8bd-237c0d82cbc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19a75058-a99a-4f37-a8bd-237c0d82cbc1" (UID: "19a75058-a99a-4f37-a8bd-237c0d82cbc1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4762]: I0217 18:07:30.001265 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a75058-a99a-4f37-a8bd-237c0d82cbc1-kube-api-access-qg4sw" (OuterVolumeSpecName: "kube-api-access-qg4sw") pod "19a75058-a99a-4f37-a8bd-237c0d82cbc1" (UID: "19a75058-a99a-4f37-a8bd-237c0d82cbc1"). InnerVolumeSpecName "kube-api-access-qg4sw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4762]: I0217 18:07:30.096392 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a75058-a99a-4f37-a8bd-237c0d82cbc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4762]: I0217 18:07:30.097001 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4sw\" (UniqueName: \"kubernetes.io/projected/19a75058-a99a-4f37-a8bd-237c0d82cbc1-kube-api-access-qg4sw\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4762]: I0217 18:07:30.576590 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" event={"ID":"19a75058-a99a-4f37-a8bd-237c0d82cbc1","Type":"ContainerDied","Data":"431b467e3ba5e23b2bd84ef6c8ccca8afc5cd59f5333c478d53f5ac96a653ebf"} Feb 17 18:07:30 crc kubenswrapper[4762]: I0217 18:07:30.576646 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431b467e3ba5e23b2bd84ef6c8ccca8afc5cd59f5333c478d53f5ac96a653ebf" Feb 17 18:07:30 crc kubenswrapper[4762]: I0217 18:07:30.576669 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5d57-account-delete-4xk79" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.112228 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.228891 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-internal-tls-certs\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.228982 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-logs\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229066 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-httpd-run\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-config-data\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229152 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229181 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-combined-ca-bundle\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229580 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-logs" (OuterVolumeSpecName: "logs") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229754 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229885 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-public-tls-certs\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fhkz\" (UniqueName: \"kubernetes.io/projected/a941e309-e15c-4890-abd6-7a44861cbbe9-kube-api-access-9fhkz\") pod \"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.229968 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-scripts\") pod 
\"a941e309-e15c-4890-abd6-7a44861cbbe9\" (UID: \"a941e309-e15c-4890-abd6-7a44861cbbe9\") " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.230359 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.230381 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a941e309-e15c-4890-abd6-7a44861cbbe9-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.233465 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.234130 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a941e309-e15c-4890-abd6-7a44861cbbe9-kube-api-access-9fhkz" (OuterVolumeSpecName: "kube-api-access-9fhkz") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "kube-api-access-9fhkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.234716 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-scripts" (OuterVolumeSpecName: "scripts") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.263528 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-config-data" (OuterVolumeSpecName: "config-data") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.268082 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.268155 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.269075 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a941e309-e15c-4890-abd6-7a44861cbbe9" (UID: "a941e309-e15c-4890-abd6-7a44861cbbe9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331278 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331329 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331339 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331352 4762 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331362 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fhkz\" (UniqueName: \"kubernetes.io/projected/a941e309-e15c-4890-abd6-7a44861cbbe9-kube-api-access-9fhkz\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331370 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.331378 4762 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a941e309-e15c-4890-abd6-7a44861cbbe9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.356520 4762 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.386848 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-c2jmn"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.391537 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-c2jmn"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.395941 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance5d57-account-delete-4xk79"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.401598 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-5d57-account-create-update-wh7zx"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.408491 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance5d57-account-delete-4xk79"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.415469 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-5d57-account-create-update-wh7zx"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.432391 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.590875 4762 generic.go:334] "Generic (PLEG): container finished" podID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerID="18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.590926 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.590925 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a941e309-e15c-4890-abd6-7a44861cbbe9","Type":"ContainerDied","Data":"18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478"} Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.591808 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a941e309-e15c-4890-abd6-7a44861cbbe9","Type":"ContainerDied","Data":"fba17e53140c1bb8dc73fd5b3959b7b35e4729785663dbde75679372bb81f26c"} Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.591833 4762 scope.go:117] "RemoveContainer" containerID="18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.612817 4762 scope.go:117] "RemoveContainer" containerID="e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.626485 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.632985 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.637824 4762 scope.go:117] "RemoveContainer" containerID="18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478" Feb 17 18:07:32 crc kubenswrapper[4762]: E0217 18:07:32.639120 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478\": container with ID starting with 18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478 not found: ID does not exist" 
containerID="18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.639174 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478"} err="failed to get container status \"18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478\": rpc error: code = NotFound desc = could not find container \"18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478\": container with ID starting with 18403516ebee7e520e7a4c92f2735e6b806a34e3fc0da43e4a4ad98b0bd20478 not found: ID does not exist" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.639203 4762 scope.go:117] "RemoveContainer" containerID="e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec" Feb 17 18:07:32 crc kubenswrapper[4762]: E0217 18:07:32.639706 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec\": container with ID starting with e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec not found: ID does not exist" containerID="e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec" Feb 17 18:07:32 crc kubenswrapper[4762]: I0217 18:07:32.639749 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec"} err="failed to get container status \"e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec\": rpc error: code = NotFound desc = could not find container \"e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec\": container with ID starting with e0ed3f367831c5b20701377c750c23ec417004fd06cbbe5719a29a53e59d0bec not found: ID does not exist" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.047912 4762 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19a75058-a99a-4f37-a8bd-237c0d82cbc1" path="/var/lib/kubelet/pods/19a75058-a99a-4f37-a8bd-237c0d82cbc1/volumes" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.049070 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f20aed1-a1f6-472c-b804-24e295bc6e18" path="/var/lib/kubelet/pods/2f20aed1-a1f6-472c-b804-24e295bc6e18/volumes" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.050336 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba5ea58-4b97-474b-b895-df3c1ee3a02e" path="/var/lib/kubelet/pods/4ba5ea58-4b97-474b-b895-df3c1ee3a02e/volumes" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.051434 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" path="/var/lib/kubelet/pods/a941e309-e15c-4890-abd6-7a44861cbbe9/volumes" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.473937 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-dcvjk"] Feb 17 18:07:33 crc kubenswrapper[4762]: E0217 18:07:33.475293 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-log" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.475381 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-log" Feb 17 18:07:33 crc kubenswrapper[4762]: E0217 18:07:33.475450 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a75058-a99a-4f37-a8bd-237c0d82cbc1" containerName="mariadb-account-delete" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.475508 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a75058-a99a-4f37-a8bd-237c0d82cbc1" containerName="mariadb-account-delete" Feb 17 18:07:33 crc kubenswrapper[4762]: E0217 18:07:33.475579 4762 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-httpd" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.475655 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-httpd" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.475852 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-httpd" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.475974 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="19a75058-a99a-4f37-a8bd-237c0d82cbc1" containerName="mariadb-account-delete" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.476069 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a941e309-e15c-4890-abd6-7a44861cbbe9" containerName="glance-log" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.476576 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.488077 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh"] Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.489144 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.493940 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.496683 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-dcvjk"] Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.517480 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh"] Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.545009 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-operator-scripts\") pod \"glance-db-create-dcvjk\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.545090 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzk7\" (UniqueName: \"kubernetes.io/projected/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-kube-api-access-mjzk7\") pod \"glance-db-create-dcvjk\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.646532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00b31dd-8e0a-40c6-8761-205f14bf1bde-operator-scripts\") pod \"glance-d4c8-account-create-update-ntnlh\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.646612 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mktw\" (UniqueName: \"kubernetes.io/projected/c00b31dd-8e0a-40c6-8761-205f14bf1bde-kube-api-access-9mktw\") pod \"glance-d4c8-account-create-update-ntnlh\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.646679 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-operator-scripts\") pod \"glance-db-create-dcvjk\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.646741 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzk7\" (UniqueName: \"kubernetes.io/projected/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-kube-api-access-mjzk7\") pod \"glance-db-create-dcvjk\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.648025 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-operator-scripts\") pod \"glance-db-create-dcvjk\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.675782 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzk7\" (UniqueName: \"kubernetes.io/projected/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-kube-api-access-mjzk7\") pod \"glance-db-create-dcvjk\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.748191 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mktw\" (UniqueName: \"kubernetes.io/projected/c00b31dd-8e0a-40c6-8761-205f14bf1bde-kube-api-access-9mktw\") pod \"glance-d4c8-account-create-update-ntnlh\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.748337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00b31dd-8e0a-40c6-8761-205f14bf1bde-operator-scripts\") pod \"glance-d4c8-account-create-update-ntnlh\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.749020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00b31dd-8e0a-40c6-8761-205f14bf1bde-operator-scripts\") pod \"glance-d4c8-account-create-update-ntnlh\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.767230 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mktw\" (UniqueName: \"kubernetes.io/projected/c00b31dd-8e0a-40c6-8761-205f14bf1bde-kube-api-access-9mktw\") pod \"glance-d4c8-account-create-update-ntnlh\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.796759 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.806795 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:33 crc kubenswrapper[4762]: I0217 18:07:33.994213 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-dcvjk"] Feb 17 18:07:34 crc kubenswrapper[4762]: W0217 18:07:34.001105 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71e4f362_7d19_48a7_a297_bae1fb8cdf8b.slice/crio-29d7b82e075e09e3593a30ffc5b08383b7164bea8fd476ecc246168dcb2649a5 WatchSource:0}: Error finding container 29d7b82e075e09e3593a30ffc5b08383b7164bea8fd476ecc246168dcb2649a5: Status 404 returned error can't find the container with id 29d7b82e075e09e3593a30ffc5b08383b7164bea8fd476ecc246168dcb2649a5 Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.045056 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh"] Feb 17 18:07:34 crc kubenswrapper[4762]: W0217 18:07:34.046780 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc00b31dd_8e0a_40c6_8761_205f14bf1bde.slice/crio-fa583ea13ea2746bfc697fbba7a90887df963d0a6f05507fa98cc88a48e1b401 WatchSource:0}: Error finding container fa583ea13ea2746bfc697fbba7a90887df963d0a6f05507fa98cc88a48e1b401: Status 404 returned error can't find the container with id fa583ea13ea2746bfc697fbba7a90887df963d0a6f05507fa98cc88a48e1b401 Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.615784 4762 generic.go:334] "Generic (PLEG): container finished" podID="71e4f362-7d19-48a7-a297-bae1fb8cdf8b" containerID="0bd686b0459fa641e36370acbb19207cdbb705a1cd5fc72480a4efa530b44028" exitCode=0 Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.615856 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dcvjk" 
event={"ID":"71e4f362-7d19-48a7-a297-bae1fb8cdf8b","Type":"ContainerDied","Data":"0bd686b0459fa641e36370acbb19207cdbb705a1cd5fc72480a4efa530b44028"} Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.616172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dcvjk" event={"ID":"71e4f362-7d19-48a7-a297-bae1fb8cdf8b","Type":"ContainerStarted","Data":"29d7b82e075e09e3593a30ffc5b08383b7164bea8fd476ecc246168dcb2649a5"} Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.617919 4762 generic.go:334] "Generic (PLEG): container finished" podID="c00b31dd-8e0a-40c6-8761-205f14bf1bde" containerID="cbde26d443a29e776a74a416e19d59ce7e75ae17968d5a66feab8bcfeaab175b" exitCode=0 Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.617951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" event={"ID":"c00b31dd-8e0a-40c6-8761-205f14bf1bde","Type":"ContainerDied","Data":"cbde26d443a29e776a74a416e19d59ce7e75ae17968d5a66feab8bcfeaab175b"} Feb 17 18:07:34 crc kubenswrapper[4762]: I0217 18:07:34.617965 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" event={"ID":"c00b31dd-8e0a-40c6-8761-205f14bf1bde","Type":"ContainerStarted","Data":"fa583ea13ea2746bfc697fbba7a90887df963d0a6f05507fa98cc88a48e1b401"} Feb 17 18:07:35 crc kubenswrapper[4762]: I0217 18:07:35.963871 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:35 crc kubenswrapper[4762]: I0217 18:07:35.968357 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.080280 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjzk7\" (UniqueName: \"kubernetes.io/projected/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-kube-api-access-mjzk7\") pod \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.080384 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mktw\" (UniqueName: \"kubernetes.io/projected/c00b31dd-8e0a-40c6-8761-205f14bf1bde-kube-api-access-9mktw\") pod \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.080535 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00b31dd-8e0a-40c6-8761-205f14bf1bde-operator-scripts\") pod \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\" (UID: \"c00b31dd-8e0a-40c6-8761-205f14bf1bde\") " Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.080590 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-operator-scripts\") pod \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\" (UID: \"71e4f362-7d19-48a7-a297-bae1fb8cdf8b\") " Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.081838 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00b31dd-8e0a-40c6-8761-205f14bf1bde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c00b31dd-8e0a-40c6-8761-205f14bf1bde" (UID: "c00b31dd-8e0a-40c6-8761-205f14bf1bde"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.082111 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71e4f362-7d19-48a7-a297-bae1fb8cdf8b" (UID: "71e4f362-7d19-48a7-a297-bae1fb8cdf8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.086943 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00b31dd-8e0a-40c6-8761-205f14bf1bde-kube-api-access-9mktw" (OuterVolumeSpecName: "kube-api-access-9mktw") pod "c00b31dd-8e0a-40c6-8761-205f14bf1bde" (UID: "c00b31dd-8e0a-40c6-8761-205f14bf1bde"). InnerVolumeSpecName "kube-api-access-9mktw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.094782 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-kube-api-access-mjzk7" (OuterVolumeSpecName: "kube-api-access-mjzk7") pod "71e4f362-7d19-48a7-a297-bae1fb8cdf8b" (UID: "71e4f362-7d19-48a7-a297-bae1fb8cdf8b"). InnerVolumeSpecName "kube-api-access-mjzk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.181979 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mktw\" (UniqueName: \"kubernetes.io/projected/c00b31dd-8e0a-40c6-8761-205f14bf1bde-kube-api-access-9mktw\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.182007 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00b31dd-8e0a-40c6-8761-205f14bf1bde-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.182018 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.182027 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjzk7\" (UniqueName: \"kubernetes.io/projected/71e4f362-7d19-48a7-a297-bae1fb8cdf8b-kube-api-access-mjzk7\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.632164 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dcvjk" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.632168 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dcvjk" event={"ID":"71e4f362-7d19-48a7-a297-bae1fb8cdf8b","Type":"ContainerDied","Data":"29d7b82e075e09e3593a30ffc5b08383b7164bea8fd476ecc246168dcb2649a5"} Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.632317 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d7b82e075e09e3593a30ffc5b08383b7164bea8fd476ecc246168dcb2649a5" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.634271 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" event={"ID":"c00b31dd-8e0a-40c6-8761-205f14bf1bde","Type":"ContainerDied","Data":"fa583ea13ea2746bfc697fbba7a90887df963d0a6f05507fa98cc88a48e1b401"} Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.634381 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa583ea13ea2746bfc697fbba7a90887df963d0a6f05507fa98cc88a48e1b401" Feb 17 18:07:36 crc kubenswrapper[4762]: I0217 18:07:36.634324 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.682103 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-g6bfq"] Feb 17 18:07:38 crc kubenswrapper[4762]: E0217 18:07:38.683862 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00b31dd-8e0a-40c6-8761-205f14bf1bde" containerName="mariadb-account-create-update" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.683963 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00b31dd-8e0a-40c6-8761-205f14bf1bde" containerName="mariadb-account-create-update" Feb 17 18:07:38 crc kubenswrapper[4762]: E0217 18:07:38.684075 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e4f362-7d19-48a7-a297-bae1fb8cdf8b" containerName="mariadb-database-create" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.684155 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e4f362-7d19-48a7-a297-bae1fb8cdf8b" containerName="mariadb-database-create" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.684424 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c00b31dd-8e0a-40c6-8761-205f14bf1bde" containerName="mariadb-account-create-update" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.684521 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e4f362-7d19-48a7-a297-bae1fb8cdf8b" containerName="mariadb-database-create" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.685178 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.688344 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.688511 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-2ljj9" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.691156 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g6bfq"] Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.822061 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-db-sync-config-data\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.822488 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6f9f\" (UniqueName: \"kubernetes.io/projected/99397293-7cbd-48dd-b637-0805fe66ddb8-kube-api-access-w6f9f\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.822582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-config-data\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.924509 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-db-sync-config-data\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.924558 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6f9f\" (UniqueName: \"kubernetes.io/projected/99397293-7cbd-48dd-b637-0805fe66ddb8-kube-api-access-w6f9f\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.924586 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-config-data\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.930864 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-config-data\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.930881 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-db-sync-config-data\") pod \"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.939775 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6f9f\" (UniqueName: \"kubernetes.io/projected/99397293-7cbd-48dd-b637-0805fe66ddb8-kube-api-access-w6f9f\") pod 
\"glance-db-sync-g6bfq\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:38 crc kubenswrapper[4762]: I0217 18:07:38.998705 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:39 crc kubenswrapper[4762]: I0217 18:07:39.974675 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g6bfq"] Feb 17 18:07:40 crc kubenswrapper[4762]: I0217 18:07:40.662703 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g6bfq" event={"ID":"99397293-7cbd-48dd-b637-0805fe66ddb8","Type":"ContainerStarted","Data":"55428f9c6f2ee74dd75d55f682f6c58aa8bc98e97bf3b6088476063fffc1b761"} Feb 17 18:07:40 crc kubenswrapper[4762]: I0217 18:07:40.663042 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g6bfq" event={"ID":"99397293-7cbd-48dd-b637-0805fe66ddb8","Type":"ContainerStarted","Data":"0afa7acc132cea63b9f3276672cbd6238b89acdafb8e0ec204d107286e8569ba"} Feb 17 18:07:40 crc kubenswrapper[4762]: I0217 18:07:40.677718 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-g6bfq" podStartSLOduration=2.677697725 podStartE2EDuration="2.677697725s" podCreationTimestamp="2026-02-17 18:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:07:40.675442321 +0000 UTC m=+1212.320360341" watchObservedRunningTime="2026-02-17 18:07:40.677697725 +0000 UTC m=+1212.322615745" Feb 17 18:07:43 crc kubenswrapper[4762]: I0217 18:07:43.687253 4762 generic.go:334] "Generic (PLEG): container finished" podID="99397293-7cbd-48dd-b637-0805fe66ddb8" containerID="55428f9c6f2ee74dd75d55f682f6c58aa8bc98e97bf3b6088476063fffc1b761" exitCode=0 Feb 17 18:07:43 crc kubenswrapper[4762]: I0217 18:07:43.687766 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g6bfq" event={"ID":"99397293-7cbd-48dd-b637-0805fe66ddb8","Type":"ContainerDied","Data":"55428f9c6f2ee74dd75d55f682f6c58aa8bc98e97bf3b6088476063fffc1b761"} Feb 17 18:07:44 crc kubenswrapper[4762]: I0217 18:07:44.990460 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.129648 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-config-data\") pod \"99397293-7cbd-48dd-b637-0805fe66ddb8\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.129784 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6f9f\" (UniqueName: \"kubernetes.io/projected/99397293-7cbd-48dd-b637-0805fe66ddb8-kube-api-access-w6f9f\") pod \"99397293-7cbd-48dd-b637-0805fe66ddb8\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.129832 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-db-sync-config-data\") pod \"99397293-7cbd-48dd-b637-0805fe66ddb8\" (UID: \"99397293-7cbd-48dd-b637-0805fe66ddb8\") " Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.135096 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "99397293-7cbd-48dd-b637-0805fe66ddb8" (UID: "99397293-7cbd-48dd-b637-0805fe66ddb8"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.140699 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99397293-7cbd-48dd-b637-0805fe66ddb8-kube-api-access-w6f9f" (OuterVolumeSpecName: "kube-api-access-w6f9f") pod "99397293-7cbd-48dd-b637-0805fe66ddb8" (UID: "99397293-7cbd-48dd-b637-0805fe66ddb8"). InnerVolumeSpecName "kube-api-access-w6f9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.171586 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-config-data" (OuterVolumeSpecName: "config-data") pod "99397293-7cbd-48dd-b637-0805fe66ddb8" (UID: "99397293-7cbd-48dd-b637-0805fe66ddb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.231849 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.231888 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6f9f\" (UniqueName: \"kubernetes.io/projected/99397293-7cbd-48dd-b637-0805fe66ddb8-kube-api-access-w6f9f\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.231902 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99397293-7cbd-48dd-b637-0805fe66ddb8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.701024 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g6bfq" 
event={"ID":"99397293-7cbd-48dd-b637-0805fe66ddb8","Type":"ContainerDied","Data":"0afa7acc132cea63b9f3276672cbd6238b89acdafb8e0ec204d107286e8569ba"} Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.701072 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afa7acc132cea63b9f3276672cbd6238b89acdafb8e0ec204d107286e8569ba" Feb 17 18:07:45 crc kubenswrapper[4762]: I0217 18:07:45.701129 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g6bfq" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.860159 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:07:46 crc kubenswrapper[4762]: E0217 18:07:46.860790 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99397293-7cbd-48dd-b637-0805fe66ddb8" containerName="glance-db-sync" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.860806 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="99397293-7cbd-48dd-b637-0805fe66ddb8" containerName="glance-db-sync" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.860981 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="99397293-7cbd-48dd-b637-0805fe66ddb8" containerName="glance-db-sync" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.862106 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.863893 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.864698 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-2ljj9" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.873820 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.874344 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.957866 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.957939 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.957989 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-logs\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958008 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-config-data\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958029 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958066 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-dev\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958083 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958111 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-run\") pod \"glance-default-external-api-0\" (UID: 
\"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958145 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958163 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958179 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-sys\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958229 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4bl\" (UniqueName: 
\"kubernetes.io/projected/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-kube-api-access-qd4bl\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:46 crc kubenswrapper[4762]: I0217 18:07:46.958253 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-scripts\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060053 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-run\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060123 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060154 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060191 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-sys\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060211 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4bl\" (UniqueName: \"kubernetes.io/projected/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-kube-api-access-qd4bl\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060237 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-scripts\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060258 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060352 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-run\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060518 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-sys\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060684 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060718 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-logs\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060695 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.060859 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061101 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061135 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-logs\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061182 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061221 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061251 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-dev\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061276 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061357 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-dev\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061456 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.061464 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.064368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-scripts\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.077333 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-config-data\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.082881 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.088206 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc 
kubenswrapper[4762]: I0217 18:07:47.093436 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4bl\" (UniqueName: \"kubernetes.io/projected/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-kube-api-access-qd4bl\") pod \"glance-default-external-api-0\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.177722 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.196902 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.198810 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.201653 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.207874 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264271 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-dev\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264317 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264341 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264360 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264409 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264443 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264485 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpl24\" (UniqueName: 
\"kubernetes.io/projected/6d319709-4dd2-4374-ad46-928f46113f48-kube-api-access-tpl24\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264509 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-sys\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264539 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264561 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264585 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264614 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-logs\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264655 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.264673 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-run\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.365795 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.365882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.365931 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tpl24\" (UniqueName: \"kubernetes.io/projected/6d319709-4dd2-4374-ad46-928f46113f48-kube-api-access-tpl24\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.365957 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-sys\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366169 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366189 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366214 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366247 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-logs\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366268 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366289 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-run\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366356 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-dev\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366383 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366410 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-nvme\") pod 
\"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366438 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.366539 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.367015 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.367845 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368129 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: 
\"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368201 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-sys\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368205 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-logs\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368301 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368319 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368343 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.368378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-run\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.375922 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.387933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.390567 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpl24\" (UniqueName: \"kubernetes.io/projected/6d319709-4dd2-4374-ad46-928f46113f48-kube-api-access-tpl24\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.392876 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-dev\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 
18:07:47.398570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.428968 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.470280 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.565095 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.713544 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerStarted","Data":"61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99"} Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.713986 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerStarted","Data":"56c1504f2dc330aeae28bad450e86ec359afa872bfd84500d5f337bdc8824cf4"} Feb 17 18:07:47 crc kubenswrapper[4762]: I0217 18:07:47.868952 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.022549 4762 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:48 crc kubenswrapper[4762]: W0217 18:07:48.026323 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d319709_4dd2_4374_ad46_928f46113f48.slice/crio-800df78937c90a9ee19b0b9768f187e7c33949cc05a70915aa3de47a479f7ad8 WatchSource:0}: Error finding container 800df78937c90a9ee19b0b9768f187e7c33949cc05a70915aa3de47a479f7ad8: Status 404 returned error can't find the container with id 800df78937c90a9ee19b0b9768f187e7c33949cc05a70915aa3de47a479f7ad8 Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.722542 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerStarted","Data":"a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670"} Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.722832 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerStarted","Data":"f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311"} Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.725886 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerStarted","Data":"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81"} Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.725949 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerStarted","Data":"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb"} Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.725972 4762 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerStarted","Data":"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0"} Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.725980 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-log" containerID="cri-o://2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" gracePeriod=30 Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.726087 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-api" containerID="cri-o://22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" gracePeriod=30 Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.725991 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerStarted","Data":"800df78937c90a9ee19b0b9768f187e7c33949cc05a70915aa3de47a479f7ad8"} Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.726163 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-httpd" containerID="cri-o://9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" gracePeriod=30 Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.759294 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.759266584 podStartE2EDuration="2.759266584s" podCreationTimestamp="2026-02-17 18:07:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:07:48.748330644 +0000 UTC m=+1220.393248654" watchObservedRunningTime="2026-02-17 18:07:48.759266584 +0000 UTC m=+1220.404184584" Feb 17 18:07:48 crc kubenswrapper[4762]: I0217 18:07:48.773877 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.773858619 podStartE2EDuration="2.773858619s" podCreationTimestamp="2026-02-17 18:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:07:48.771358098 +0000 UTC m=+1220.416276108" watchObservedRunningTime="2026-02-17 18:07:48.773858619 +0000 UTC m=+1220.418776629" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.120502 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.200969 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-var-locks-brick\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201013 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-iscsi\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201053 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-scripts\") pod 
\"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201077 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-httpd-run\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201098 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-dev\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201128 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-config-data\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201144 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201199 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-run\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201236 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-sys\") 
pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201250 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-logs\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201268 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-lib-modules\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201303 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201333 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-nvme\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.201354 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpl24\" (UniqueName: \"kubernetes.io/projected/6d319709-4dd2-4374-ad46-928f46113f48-kube-api-access-tpl24\") pod \"6d319709-4dd2-4374-ad46-928f46113f48\" (UID: \"6d319709-4dd2-4374-ad46-928f46113f48\") " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203006 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-logs" (OuterVolumeSpecName: "logs") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203077 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-run" (OuterVolumeSpecName: "run") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203104 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-sys" (OuterVolumeSpecName: "sys") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203458 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-dev" (OuterVolumeSpecName: "dev") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203514 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203733 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203752 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203766 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.203752 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.208299 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-scripts" (OuterVolumeSpecName: "scripts") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.208558 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d319709-4dd2-4374-ad46-928f46113f48-kube-api-access-tpl24" (OuterVolumeSpecName: "kube-api-access-tpl24") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "kube-api-access-tpl24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.208815 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance-cache") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.211701 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.279072 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-config-data" (OuterVolumeSpecName: "config-data") pod "6d319709-4dd2-4374-ad46-928f46113f48" (UID: "6d319709-4dd2-4374-ad46-928f46113f48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303548 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303593 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpl24\" (UniqueName: \"kubernetes.io/projected/6d319709-4dd2-4374-ad46-928f46113f48-kube-api-access-tpl24\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303606 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303632 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303646 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303656 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303667 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303678 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d319709-4dd2-4374-ad46-928f46113f48-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303704 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303718 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303730 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303743 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d319709-4dd2-4374-ad46-928f46113f48-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303754 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d319709-4dd2-4374-ad46-928f46113f48-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.303772 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.328400 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.329662 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.405079 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.405411 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734786 4762 generic.go:334] "Generic (PLEG): container finished" podID="6d319709-4dd2-4374-ad46-928f46113f48" containerID="22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" exitCode=143 Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734829 4762 generic.go:334] "Generic (PLEG): container finished" podID="6d319709-4dd2-4374-ad46-928f46113f48" containerID="9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" exitCode=143 Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734840 4762 generic.go:334] "Generic (PLEG): container finished" podID="6d319709-4dd2-4374-ad46-928f46113f48" containerID="2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" exitCode=143 Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734858 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerDied","Data":"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81"} Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734895 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerDied","Data":"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb"} Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734908 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerDied","Data":"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0"} Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734919 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6d319709-4dd2-4374-ad46-928f46113f48","Type":"ContainerDied","Data":"800df78937c90a9ee19b0b9768f187e7c33949cc05a70915aa3de47a479f7ad8"} Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734940 4762 scope.go:117] "RemoveContainer" containerID="22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.734840 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.763450 4762 scope.go:117] "RemoveContainer" containerID="9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.791090 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.810782 4762 scope.go:117] "RemoveContainer" containerID="2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.855723 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.864734 4762 scope.go:117] "RemoveContainer" containerID="22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" Feb 17 18:07:49 crc kubenswrapper[4762]: E0217 18:07:49.868275 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": container with ID starting with 22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81 not found: ID does not exist" containerID="22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.868342 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81"} err="failed to get container status \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": rpc error: code = NotFound desc = could not find container \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": container with ID starting with 22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81 not 
found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.868369 4762 scope.go:117] "RemoveContainer" containerID="9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" Feb 17 18:07:49 crc kubenswrapper[4762]: E0217 18:07:49.885295 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": container with ID starting with 9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb not found: ID does not exist" containerID="9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885345 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb"} err="failed to get container status \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": rpc error: code = NotFound desc = could not find container \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": container with ID starting with 9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885369 4762 scope.go:117] "RemoveContainer" containerID="2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885465 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:49 crc kubenswrapper[4762]: E0217 18:07:49.885781 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-httpd" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885798 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d319709-4dd2-4374-ad46-928f46113f48" 
containerName="glance-httpd" Feb 17 18:07:49 crc kubenswrapper[4762]: E0217 18:07:49.885823 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-log" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885830 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-log" Feb 17 18:07:49 crc kubenswrapper[4762]: E0217 18:07:49.885839 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-api" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885844 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-api" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885963 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-api" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885980 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-log" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.885994 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d319709-4dd2-4374-ad46-928f46113f48" containerName="glance-httpd" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.886849 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:49 crc kubenswrapper[4762]: E0217 18:07:49.887572 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": container with ID starting with 2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0 not found: ID does not exist" containerID="2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.887588 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0"} err="failed to get container status \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": rpc error: code = NotFound desc = could not find container \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": container with ID starting with 2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0 not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.887600 4762 scope.go:117] "RemoveContainer" containerID="22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.888502 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81"} err="failed to get container status \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": rpc error: code = NotFound desc = could not find container \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": container with ID starting with 22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81 not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.888517 4762 scope.go:117] 
"RemoveContainer" containerID="9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.889507 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb"} err="failed to get container status \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": rpc error: code = NotFound desc = could not find container \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": container with ID starting with 9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.889524 4762 scope.go:117] "RemoveContainer" containerID="2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.890938 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0"} err="failed to get container status \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": rpc error: code = NotFound desc = could not find container \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": container with ID starting with 2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0 not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.890988 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.890993 4762 scope.go:117] "RemoveContainer" containerID="22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.891305 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81"} err="failed to get container status \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": rpc error: code = NotFound desc = could not find container \"22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81\": container with ID starting with 22be9ec8d51b2bf3d2cf921d8dcda6d734dc2340b9686292e3728f2a7f5baf81 not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.891323 4762 scope.go:117] "RemoveContainer" containerID="9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.891566 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb"} err="failed to get container status \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": rpc error: code = NotFound desc = could not find container \"9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb\": container with ID starting with 9ba2892052d4f8304bbf39642953bedbfd891bafab5e1bf39c370e4586450fdb not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.891589 4762 scope.go:117] "RemoveContainer" containerID="2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.891872 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0"} err="failed to get container status \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": rpc error: code = NotFound desc = could not find container \"2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0\": container with ID starting with 2b5afdc0d4c1b1f2c96aaa22bbd4eda5c9e307d65fbec6cf1cc0831c31dee0f0 not found: ID does not 
exist" Feb 17 18:07:49 crc kubenswrapper[4762]: I0217 18:07:49.900948 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.018837 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-dev\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.018882 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-sys\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.018903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-run\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.018928 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.018956 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019035 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019060 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019076 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-logs\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019093 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019113 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019137 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019165 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.019180 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2ztt\" (UniqueName: \"kubernetes.io/projected/64958179-a093-4f81-a142-ae9b2f42b19c-kube-api-access-c2ztt\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120554 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120687 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120716 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120736 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120749 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120685 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120771 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-logs\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120822 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120868 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120898 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120940 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2ztt\" (UniqueName: \"kubernetes.io/projected/64958179-a093-4f81-a142-ae9b2f42b19c-kube-api-access-c2ztt\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.120993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-dev\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-sys\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121037 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-run\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121066 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121181 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-logs\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121289 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121295 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121412 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-dev\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121468 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-sys\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-run\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121558 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.121935 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.130980 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.139151 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.143454 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.145303 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.158700 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2ztt\" (UniqueName: \"kubernetes.io/projected/64958179-a093-4f81-a142-ae9b2f42b19c-kube-api-access-c2ztt\") pod \"glance-default-internal-api-0\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.203937 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.656181 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:07:50 crc kubenswrapper[4762]: I0217 18:07:50.744480 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerStarted","Data":"2c7f32adad765678fc7a42bde5358b58d98d838f5eb75a2207849dd062048673"} Feb 17 18:07:51 crc kubenswrapper[4762]: I0217 18:07:51.046832 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d319709-4dd2-4374-ad46-928f46113f48" path="/var/lib/kubelet/pods/6d319709-4dd2-4374-ad46-928f46113f48/volumes" Feb 17 18:07:51 crc kubenswrapper[4762]: I0217 18:07:51.753922 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerStarted","Data":"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"} Feb 17 18:07:51 crc kubenswrapper[4762]: I0217 18:07:51.754287 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerStarted","Data":"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"} Feb 17 18:07:51 crc kubenswrapper[4762]: I0217 18:07:51.754303 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerStarted","Data":"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"} Feb 17 18:07:51 crc kubenswrapper[4762]: I0217 18:07:51.775358 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" 
podStartSLOduration=2.775342394 podStartE2EDuration="2.775342394s" podCreationTimestamp="2026-02-17 18:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:07:51.77449451 +0000 UTC m=+1223.419412520" watchObservedRunningTime="2026-02-17 18:07:51.775342394 +0000 UTC m=+1223.420260404" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.177879 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.178492 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.178507 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.203385 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.211600 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.218724 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.804247 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.804301 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.804318 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.818016 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.818143 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:07:57 crc kubenswrapper[4762]: I0217 18:07:57.820129 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.205198 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.205641 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.205657 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.227132 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.228538 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.240305 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.821241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.821571 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.821582 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.832590 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.833183 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:00 crc kubenswrapper[4762]: I0217 18:08:00.833232 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.735913 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.738746 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.745437 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.747371 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.755377 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.763859 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.819985 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-scripts\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820149 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-config-data\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-sys\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820243 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820267 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7gl\" (UniqueName: \"kubernetes.io/projected/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-kube-api-access-cl7gl\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820336 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820376 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820416 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-config-data\") pod \"glance-default-external-api-2\" 
(UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820451 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820637 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820675 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-dev\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820706 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820790 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-scripts\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820828 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-run\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820875 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-logs\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820913 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-logs\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820943 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-dev\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.820958 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwf79\" (UniqueName: \"kubernetes.io/projected/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-kube-api-access-vwf79\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-sys\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821043 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821068 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821090 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821135 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821160 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.821183 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-run\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.875176 4762 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.876798 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.883835 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.885798 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.893434 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.907808 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928221 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928265 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-scripts\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928283 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-run\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928305 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-logs\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928324 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-logs\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928344 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-dev\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928362 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwf79\" (UniqueName: \"kubernetes.io/projected/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-kube-api-access-vwf79\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-sys\") 
pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928394 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928410 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928438 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928482 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: 
\"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928498 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-run\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928516 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-scripts\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-config-data\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-sys\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928582 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928598 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928619 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7gl\" (UniqueName: \"kubernetes.io/projected/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-kube-api-access-cl7gl\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928673 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928692 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928711 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-config-data\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928742 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928758 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928785 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-dev\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.928819 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929113 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-run\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929182 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-sys\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929290 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929322 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 
18:08:02.929329 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929408 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929563 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929609 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929649 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-sys\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929675 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-dev\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929718 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929719 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-run\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929793 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-dev\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.929886 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-logs\") pod 
\"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.930057 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-logs\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.930171 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.930419 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.930450 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.930512 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-httpd-run\") pod 
\"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.934344 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.939991 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-scripts\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.940309 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-config-data\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.942561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-scripts\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.947258 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-config-data\") pod \"glance-default-external-api-1\" (UID: 
\"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.968098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.971554 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwf79\" (UniqueName: \"kubernetes.io/projected/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-kube-api-access-vwf79\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.974407 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.978924 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.981999 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7gl\" (UniqueName: \"kubernetes.io/projected/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-kube-api-access-cl7gl\") pod \"glance-default-external-api-2\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:02 crc kubenswrapper[4762]: I0217 18:08:02.999069 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032271 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-dev\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-config-data\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032355 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032381 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032413 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032434 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-sys\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032472 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-run\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032529 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-sys\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032551 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-config-data\") pod \"glance-default-internal-api-2\" (UID: 
\"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032572 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032615 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032668 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.032695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033008 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-var-locks-brick\") pod 
\"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033035 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033058 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033096 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphh2\" (UniqueName: \"kubernetes.io/projected/3f51072c-35bd-4c70-ac0a-307406b3dcc8-kube-api-access-qphh2\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033119 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-logs\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033139 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033162 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6swn7\" (UniqueName: \"kubernetes.io/projected/e363f68f-6964-4929-a32e-a0c55a4dabef-kube-api-access-6swn7\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033182 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-scripts\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033201 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-dev\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033220 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033242 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033266 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033289 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-run\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033314 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.033336 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-logs\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.066201 4762 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.077800 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135151 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135213 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135236 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135251 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135279 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qphh2\" (UniqueName: \"kubernetes.io/projected/3f51072c-35bd-4c70-ac0a-307406b3dcc8-kube-api-access-qphh2\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-logs\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135318 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135338 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6swn7\" (UniqueName: \"kubernetes.io/projected/e363f68f-6964-4929-a32e-a0c55a4dabef-kube-api-access-6swn7\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135376 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-scripts\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135390 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-dev\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135408 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135425 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135459 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-run\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135515 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-logs\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135604 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-dev\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135673 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-config-data\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135699 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135724 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.135993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-sys\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136032 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-run\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136061 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-sys\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136077 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136104 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136127 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136150 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136484 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") device mount path \"/mnt/openstack/pv16\"" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136506 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136674 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136676 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137022 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137077 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137095 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137177 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-sys\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137181 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-logs\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137209 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-sys\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137239 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137241 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-run\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137599 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137617 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-dev\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137822 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137888 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.137987 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-logs\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.136416 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-run\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.138374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-dev\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.140825 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-scripts\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.141604 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-config-data\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.143009 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.152955 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.158892 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6swn7\" (UniqueName: \"kubernetes.io/projected/e363f68f-6964-4929-a32e-a0c55a4dabef-kube-api-access-6swn7\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.163650 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphh2\" (UniqueName: 
\"kubernetes.io/projected/3f51072c-35bd-4c70-ac0a-307406b3dcc8-kube-api-access-qphh2\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.168282 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.175662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.176113 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.179201 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-2\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.207586 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.220348 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:03 crc kubenswrapper[4762]: W0217 18:08:03.500416 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5f95fec_67fe_4a75_9aed_6ac0944ea78e.slice/crio-09be42799ef048445dac3e5250ffbc32cb2269ae259d6584dd2b4f9f95cd92be WatchSource:0}: Error finding container 09be42799ef048445dac3e5250ffbc32cb2269ae259d6584dd2b4f9f95cd92be: Status 404 returned error can't find the container with id 09be42799ef048445dac3e5250ffbc32cb2269ae259d6584dd2b4f9f95cd92be Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.501610 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.567372 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:03 crc kubenswrapper[4762]: W0217 18:08:03.589541 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79fcc6fe_4adb_4638_a9b4_bbe2b968954f.slice/crio-0d105604bda61cc7aca1fb015d58e4446c92288884709208364699f84deba79f WatchSource:0}: Error finding container 0d105604bda61cc7aca1fb015d58e4446c92288884709208364699f84deba79f: Status 404 returned error can't find the container with id 0d105604bda61cc7aca1fb015d58e4446c92288884709208364699f84deba79f Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.664719 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.674401 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.842173 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerStarted","Data":"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.842534 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerStarted","Data":"0d105604bda61cc7aca1fb015d58e4446c92288884709208364699f84deba79f"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.843429 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerStarted","Data":"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.843458 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerStarted","Data":"2f5dcf1b8c8569d1e2281bc31d28720767bfc8da0990d8c8e101e5dab807a674"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.845296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerStarted","Data":"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.845321 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerStarted","Data":"6137cdf75124a354ba37b0935c3d73645cdd76d28edabc710782249cd30978cd"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.847064 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerStarted","Data":"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.847089 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerStarted","Data":"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761"} Feb 17 18:08:03 crc kubenswrapper[4762]: I0217 18:08:03.847100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerStarted","Data":"09be42799ef048445dac3e5250ffbc32cb2269ae259d6584dd2b4f9f95cd92be"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.857763 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerStarted","Data":"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.858542 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerStarted","Data":"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.861289 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" 
event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerStarted","Data":"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.861418 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerStarted","Data":"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.863485 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerStarted","Data":"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.863571 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerStarted","Data":"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.866685 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerStarted","Data":"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc"} Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.890320 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.890296748 podStartE2EDuration="3.890296748s" podCreationTimestamp="2026-02-17 18:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:04.889921937 +0000 UTC m=+1236.534840017" watchObservedRunningTime="2026-02-17 
18:08:04.890296748 +0000 UTC m=+1236.535214758" Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.924285 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.924270225 podStartE2EDuration="3.924270225s" podCreationTimestamp="2026-02-17 18:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:04.92374791 +0000 UTC m=+1236.568665930" watchObservedRunningTime="2026-02-17 18:08:04.924270225 +0000 UTC m=+1236.569188235" Feb 17 18:08:04 crc kubenswrapper[4762]: I0217 18:08:04.970851 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.97082812 podStartE2EDuration="3.97082812s" podCreationTimestamp="2026-02-17 18:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:04.952474017 +0000 UTC m=+1236.597392027" watchObservedRunningTime="2026-02-17 18:08:04.97082812 +0000 UTC m=+1236.615746130" Feb 17 18:08:05 crc kubenswrapper[4762]: I0217 18:08:05.003164 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=4.003142379 podStartE2EDuration="4.003142379s" podCreationTimestamp="2026-02-17 18:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:04.994404871 +0000 UTC m=+1236.639322881" watchObservedRunningTime="2026-02-17 18:08:05.003142379 +0000 UTC m=+1236.648060389" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.066958 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc 
kubenswrapper[4762]: I0217 18:08:13.067592 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.067609 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.079116 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.079170 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.079185 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.089132 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.089240 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.106449 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.107042 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.108842 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.124353 4762 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.208830 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.209344 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.209417 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.220805 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.220924 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.221165 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.234862 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.238722 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.247041 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.247317 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.249169 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.260121 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931373 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931714 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931730 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931742 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931752 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931761 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931771 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931780 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931787 
4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931795 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931805 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.931823 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.943586 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.943676 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.944124 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.944401 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.949054 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.949446 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.950205 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.951819 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.952046 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.952329 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.953018 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:13 crc kubenswrapper[4762]: I0217 18:08:13.953511 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:15 crc kubenswrapper[4762]: I0217 18:08:15.401101 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:15 crc kubenswrapper[4762]: I0217 18:08:15.408119 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:15 crc kubenswrapper[4762]: I0217 18:08:15.604247 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:15 crc kubenswrapper[4762]: I0217 18:08:15.611324 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947295 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-log" 
containerID="cri-o://ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947701 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-httpd" containerID="cri-o://c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947389 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-api" containerID="cri-o://db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947389 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-httpd" containerID="cri-o://e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947436 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-api" containerID="cri-o://90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947446 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-httpd" containerID="cri-o://ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 
18:08:16.947414 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-log" containerID="cri-o://56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947672 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-log" containerID="cri-o://4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947687 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-api" containerID="cri-o://79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947926 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-log" containerID="cri-o://38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947954 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-api" containerID="cri-o://f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" gracePeriod=30 Feb 17 18:08:16 crc kubenswrapper[4762]: I0217 18:08:16.947989 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" 
podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-httpd" containerID="cri-o://f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" gracePeriod=30 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.750778 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.823707 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.824951 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899202 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-lib-modules\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899265 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-run\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899313 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-config-data\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899330 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899376 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-scripts\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899413 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-httpd-run\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899435 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-logs\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899454 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-var-locks-brick\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899472 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwf79\" (UniqueName: \"kubernetes.io/projected/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-kube-api-access-vwf79\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899510 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-sys\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899527 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-iscsi\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899550 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-nvme\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899572 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.899613 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-dev\") pod \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\" (UID: \"d5f95fec-67fe-4a75-9aed-6ac0944ea78e\") " Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.900053 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-dev" (OuterVolumeSpecName: "dev") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.900081 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.900099 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-run" (OuterVolumeSpecName: "run") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.900446 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.900755 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.901280 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.901328 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-sys" (OuterVolumeSpecName: "sys") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.901509 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.901930 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-logs" (OuterVolumeSpecName: "logs") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.906860 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.908364 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.908457 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-scripts" (OuterVolumeSpecName: "scripts") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.908652 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-kube-api-access-vwf79" (OuterVolumeSpecName: "kube-api-access-vwf79") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). InnerVolumeSpecName "kube-api-access-vwf79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.910520 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.962343 4762 generic.go:334] "Generic (PLEG): container finished" podID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerID="db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.962401 4762 generic.go:334] "Generic (PLEG): container finished" podID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerID="e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.962413 4762 generic.go:334] "Generic (PLEG): container finished" podID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerID="ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" exitCode=143 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.962573 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.963567 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerDied","Data":"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.963599 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerDied","Data":"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.963653 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerDied","Data":"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.963668 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"d5f95fec-67fe-4a75-9aed-6ac0944ea78e","Type":"ContainerDied","Data":"09be42799ef048445dac3e5250ffbc32cb2269ae259d6584dd2b4f9f95cd92be"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.963686 4762 scope.go:117] "RemoveContainer" containerID="db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968162 4762 generic.go:334] "Generic (PLEG): container finished" podID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerID="79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968193 4762 generic.go:334] "Generic (PLEG): container finished" podID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerID="c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968200 4762 generic.go:334] "Generic (PLEG): container finished" podID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerID="4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" exitCode=143 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968241 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerDied","Data":"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968265 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerDied","Data":"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968275 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerDied","Data":"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968284 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"79fcc6fe-4adb-4638-a9b4-bbe2b968954f","Type":"ContainerDied","Data":"0d105604bda61cc7aca1fb015d58e4446c92288884709208364699f84deba79f"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.968340 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972017 4762 generic.go:334] "Generic (PLEG): container finished" podID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerID="90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972052 4762 generic.go:334] "Generic (PLEG): container finished" podID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerID="ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972063 4762 generic.go:334] "Generic (PLEG): container finished" podID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerID="56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" exitCode=143 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972127 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972151 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerDied","Data":"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972182 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerDied","Data":"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972196 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerDied","Data":"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.972213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e363f68f-6964-4929-a32e-a0c55a4dabef","Type":"ContainerDied","Data":"2f5dcf1b8c8569d1e2281bc31d28720767bfc8da0990d8c8e101e5dab807a674"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975541 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerID="f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975570 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerID="f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" exitCode=0 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975578 4762 generic.go:334] "Generic (PLEG): container 
finished" podID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerID="38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" exitCode=143 Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975596 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerDied","Data":"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975633 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerDied","Data":"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975644 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerDied","Data":"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975654 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"3f51072c-35bd-4c70-ac0a-307406b3dcc8","Type":"ContainerDied","Data":"6137cdf75124a354ba37b0935c3d73645cdd76d28edabc710782249cd30978cd"} Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.975703 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.983352 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-config-data" (OuterVolumeSpecName: "config-data") pod "d5f95fec-67fe-4a75-9aed-6ac0944ea78e" (UID: "d5f95fec-67fe-4a75-9aed-6ac0944ea78e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:17 crc kubenswrapper[4762]: I0217 18:08:17.992806 4762 scope.go:117] "RemoveContainer" containerID="e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001143 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-dev\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001200 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-httpd-run\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001250 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-config-data\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001266 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-lib-modules\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001292 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-dev\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc 
kubenswrapper[4762]: I0217 18:08:18.001310 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-iscsi\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001330 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001348 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001378 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-iscsi\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001393 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-var-locks-brick\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001439 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-run\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 
17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001458 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-scripts\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001471 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-sys\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001493 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-logs\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001523 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-nvme\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001669 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-logs\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001700 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-config-data\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: 
\"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001721 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-nvme\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001747 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl7gl\" (UniqueName: \"kubernetes.io/projected/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-kube-api-access-cl7gl\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001760 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-var-locks-brick\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001762 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001779 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001795 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-sys\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001800 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001816 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001825 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001834 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qphh2\" (UniqueName: \"kubernetes.io/projected/3f51072c-35bd-4c70-ac0a-307406b3dcc8-kube-api-access-qphh2\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001851 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-dev" (OuterVolumeSpecName: "dev") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001863 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-scripts\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001887 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-httpd-run\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001903 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-run\") pod \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\" (UID: \"79fcc6fe-4adb-4638-a9b4-bbe2b968954f\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.001932 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-lib-modules\") pod \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\" (UID: \"3f51072c-35bd-4c70-ac0a-307406b3dcc8\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002213 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002246 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002267 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-sys" (OuterVolumeSpecName: "sys") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002381 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-dev" (OuterVolumeSpecName: "dev") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002455 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002472 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002480 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002490 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002498 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002507 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002515 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002523 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002532 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002551 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002561 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002570 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002578 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002587 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002455 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). 
InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002637 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-sys" (OuterVolumeSpecName: "sys") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002636 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002596 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwf79\" (UniqueName: \"kubernetes.io/projected/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-kube-api-access-vwf79\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002732 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002756 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002773 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d5f95fec-67fe-4a75-9aed-6ac0944ea78e-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc 
kubenswrapper[4762]: I0217 18:08:18.002795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-logs" (OuterVolumeSpecName: "logs") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002818 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.002833 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.003173 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.003210 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.003236 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-run" (OuterVolumeSpecName: "run") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.003258 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-run" (OuterVolumeSpecName: "run") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.003822 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-logs" (OuterVolumeSpecName: "logs") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.005858 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.006503 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f51072c-35bd-4c70-ac0a-307406b3dcc8-kube-api-access-qphh2" (OuterVolumeSpecName: "kube-api-access-qphh2") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "kube-api-access-qphh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.006736 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.007008 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.007089 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-scripts" (OuterVolumeSpecName: "scripts") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.008222 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-scripts" (OuterVolumeSpecName: "scripts") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.009231 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-kube-api-access-cl7gl" (OuterVolumeSpecName: "kube-api-access-cl7gl") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "kube-api-access-cl7gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.015455 4762 scope.go:117] "RemoveContainer" containerID="ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.018148 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.018851 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.018843 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.032772 4762 scope.go:117] "RemoveContainer" containerID="db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.033304 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": container with ID starting with db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc not found: ID does not exist" containerID="db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.033400 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc"} err="failed to get container status \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": rpc error: code = NotFound desc = could not find container \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": container with ID starting with db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.033521 4762 scope.go:117] "RemoveContainer" containerID="e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.034247 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": container with ID starting with e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793 not found: ID does not exist" containerID="e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 
18:08:18.034302 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793"} err="failed to get container status \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": rpc error: code = NotFound desc = could not find container \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": container with ID starting with e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.034337 4762 scope.go:117] "RemoveContainer" containerID="ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.034677 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": container with ID starting with ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761 not found: ID does not exist" containerID="ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.034783 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761"} err="failed to get container status \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": rpc error: code = NotFound desc = could not find container \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": container with ID starting with ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.034853 4762 scope.go:117] "RemoveContainer" containerID="db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" Feb 17 18:08:18 crc 
kubenswrapper[4762]: I0217 18:08:18.035194 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc"} err="failed to get container status \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": rpc error: code = NotFound desc = could not find container \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": container with ID starting with db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035246 4762 scope.go:117] "RemoveContainer" containerID="e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035509 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793"} err="failed to get container status \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": rpc error: code = NotFound desc = could not find container \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": container with ID starting with e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035531 4762 scope.go:117] "RemoveContainer" containerID="ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035747 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761"} err="failed to get container status \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": rpc error: code = NotFound desc = could not find container \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": container 
with ID starting with ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035766 4762 scope.go:117] "RemoveContainer" containerID="db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035957 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc"} err="failed to get container status \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": rpc error: code = NotFound desc = could not find container \"db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc\": container with ID starting with db5051d1df06a7a39823c85df1f6891e7e77615d42233c0b9831101586ef1acc not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.035972 4762 scope.go:117] "RemoveContainer" containerID="e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.036120 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793"} err="failed to get container status \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": rpc error: code = NotFound desc = could not find container \"e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793\": container with ID starting with e512d23785ac37274a26bffd2db66fc16c199728e89edda8874314c6b3784793 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.036134 4762 scope.go:117] "RemoveContainer" containerID="ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.036283 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761"} err="failed to get container status \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": rpc error: code = NotFound desc = could not find container \"ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761\": container with ID starting with ac31b79845a1b940fdfc38afc25be88b3bc4fc90ce9cf17a688aee553bd94761 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.036297 4762 scope.go:117] "RemoveContainer" containerID="79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.065594 4762 scope.go:117] "RemoveContainer" containerID="c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.091728 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-config-data" (OuterVolumeSpecName: "config-data") pod "3f51072c-35bd-4c70-ac0a-307406b3dcc8" (UID: "3f51072c-35bd-4c70-ac0a-307406b3dcc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.101435 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-config-data" (OuterVolumeSpecName: "config-data") pod "79fcc6fe-4adb-4638-a9b4-bbe2b968954f" (UID: "79fcc6fe-4adb-4638-a9b4-bbe2b968954f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.103952 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-logs\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.103983 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-nvme\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104005 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-dev\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-lib-modules\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104134 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6swn7\" (UniqueName: \"kubernetes.io/projected/e363f68f-6964-4929-a32e-a0c55a4dabef-kube-api-access-6swn7\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104167 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104187 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-iscsi\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104207 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104232 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-var-locks-brick\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104265 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-scripts\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104283 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-run\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104320 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-sys\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104351 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-config-data\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104413 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-httpd-run\") pod \"e363f68f-6964-4929-a32e-a0c55a4dabef\" (UID: \"e363f68f-6964-4929-a32e-a0c55a4dabef\") " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104745 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104778 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104794 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qphh2\" (UniqueName: \"kubernetes.io/projected/3f51072c-35bd-4c70-ac0a-307406b3dcc8-kube-api-access-qphh2\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104806 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104817 4762 reconciler_common.go:293] "Volume detached 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104828 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104838 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104854 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104865 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104875 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104886 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104903 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104919 4762 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104931 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104943 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104954 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104965 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104975 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104986 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f51072c-35bd-4c70-ac0a-307406b3dcc8-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.104996 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f51072c-35bd-4c70-ac0a-307406b3dcc8-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105007 4762 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f51072c-35bd-4c70-ac0a-307406b3dcc8-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105020 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105031 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl7gl\" (UniqueName: \"kubernetes.io/projected/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-kube-api-access-cl7gl\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105043 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79fcc6fe-4adb-4638-a9b4-bbe2b968954f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105062 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105447 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-logs" (OuterVolumeSpecName: "logs") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105830 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105855 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-dev" (OuterVolumeSpecName: "dev") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.105877 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.106174 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-sys" (OuterVolumeSpecName: "sys") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.106180 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-run" (OuterVolumeSpecName: "run") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.106273 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.106365 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.109022 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e363f68f-6964-4929-a32e-a0c55a4dabef-kube-api-access-6swn7" (OuterVolumeSpecName: "kube-api-access-6swn7") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "kube-api-access-6swn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.109751 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.111408 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-scripts" (OuterVolumeSpecName: "scripts") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.113789 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.120794 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.120965 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.122945 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.124281 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.192657 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-config-data" (OuterVolumeSpecName: "config-data") pod "e363f68f-6964-4929-a32e-a0c55a4dabef" (UID: "e363f68f-6964-4929-a32e-a0c55a4dabef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208350 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208390 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208403 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208415 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208427 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6swn7\" (UniqueName: \"kubernetes.io/projected/e363f68f-6964-4929-a32e-a0c55a4dabef-kube-api-access-6swn7\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208454 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208468 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208486 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for 
volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208499 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208511 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208523 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208535 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208545 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208557 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e363f68f-6964-4929-a32e-a0c55a4dabef-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208568 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e363f68f-6964-4929-a32e-a0c55a4dabef-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208580 4762 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e363f68f-6964-4929-a32e-a0c55a4dabef-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208592 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.208602 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.220247 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.221950 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.238585 4762 scope.go:117] "RemoveContainer" containerID="4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.259703 4762 scope.go:117] "RemoveContainer" containerID="79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.260138 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": container with ID starting with 79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2 not found: ID does not exist" containerID="79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260162 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2"} err="failed to get container status \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": rpc error: code = NotFound desc = could not find container \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": container with ID starting with 79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260181 4762 scope.go:117] "RemoveContainer" containerID="c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.260504 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": container with ID starting with c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822 not found: ID does not exist" containerID="c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260525 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822"} err="failed to get container status \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": rpc error: code = NotFound desc = could not find container \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": container with ID starting with c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260550 4762 scope.go:117] "RemoveContainer" containerID="4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 
18:08:18.260758 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": container with ID starting with 4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3 not found: ID does not exist" containerID="4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260775 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3"} err="failed to get container status \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": rpc error: code = NotFound desc = could not find container \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": container with ID starting with 4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260786 4762 scope.go:117] "RemoveContainer" containerID="79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.260996 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2"} err="failed to get container status \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": rpc error: code = NotFound desc = could not find container \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": container with ID starting with 79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261017 4762 scope.go:117] "RemoveContainer" containerID="c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" Feb 17 18:08:18 crc 
kubenswrapper[4762]: I0217 18:08:18.261196 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822"} err="failed to get container status \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": rpc error: code = NotFound desc = could not find container \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": container with ID starting with c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261213 4762 scope.go:117] "RemoveContainer" containerID="4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261445 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3"} err="failed to get container status \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": rpc error: code = NotFound desc = could not find container \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": container with ID starting with 4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261460 4762 scope.go:117] "RemoveContainer" containerID="79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261784 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2"} err="failed to get container status \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": rpc error: code = NotFound desc = could not find container \"79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2\": container 
with ID starting with 79a80de0a2ec8d16c139563aaaf42c33d5ee9f6dc87636cb02af95f3946946e2 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261802 4762 scope.go:117] "RemoveContainer" containerID="c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261985 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822"} err="failed to get container status \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": rpc error: code = NotFound desc = could not find container \"c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822\": container with ID starting with c16ee8f043ee8e6b45b01322ba2503deb1d20998129544fe5a66469813244822 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.261999 4762 scope.go:117] "RemoveContainer" containerID="4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.262184 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3"} err="failed to get container status \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": rpc error: code = NotFound desc = could not find container \"4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3\": container with ID starting with 4c1f415f8e5b2abf81a26e9b82054a9bcb8778aae936514e414194e5832359f3 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.262207 4762 scope.go:117] "RemoveContainer" containerID="90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.281128 4762 scope.go:117] "RemoveContainer" 
containerID="ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.299313 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.310584 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.310645 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.313463 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.316139 4762 scope.go:117] "RemoveContainer" containerID="56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.329369 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.335145 4762 scope.go:117] "RemoveContainer" containerID="90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.335736 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": container with ID starting with 90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1 not found: ID does not exist" containerID="90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.335796 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1"} err="failed to get container status \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": rpc error: code = NotFound desc = could not find container \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": container with ID starting with 90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.335824 4762 scope.go:117] "RemoveContainer" containerID="ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.336125 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": container with ID starting with ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd not found: ID does not exist" containerID="ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.336180 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd"} err="failed to get container status \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": rpc error: code = NotFound desc = could not find container \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": container with ID starting with ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.336197 4762 scope.go:117] "RemoveContainer" containerID="56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 
18:08:18.336459 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": container with ID starting with 56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad not found: ID does not exist" containerID="56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.336490 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad"} err="failed to get container status \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": rpc error: code = NotFound desc = could not find container \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": container with ID starting with 56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.336509 4762 scope.go:117] "RemoveContainer" containerID="90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.336858 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1"} err="failed to get container status \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": rpc error: code = NotFound desc = could not find container \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": container with ID starting with 90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.336897 4762 scope.go:117] "RemoveContainer" containerID="ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" Feb 17 18:08:18 crc 
kubenswrapper[4762]: I0217 18:08:18.337355 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd"} err="failed to get container status \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": rpc error: code = NotFound desc = could not find container \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": container with ID starting with ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.337374 4762 scope.go:117] "RemoveContainer" containerID="56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.337661 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad"} err="failed to get container status \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": rpc error: code = NotFound desc = could not find container \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": container with ID starting with 56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.337681 4762 scope.go:117] "RemoveContainer" containerID="90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.337944 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1"} err="failed to get container status \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": rpc error: code = NotFound desc = could not find container \"90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1\": container 
with ID starting with 90511042bdc7a452418e0d65082518784cf12b8a8e22f9df4c2706790e5290f1 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.337966 4762 scope.go:117] "RemoveContainer" containerID="ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.338914 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd"} err="failed to get container status \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": rpc error: code = NotFound desc = could not find container \"ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd\": container with ID starting with ba5476d7f8b52caee606b95d6881b3b1960bf8d96995cffe195197a2f2b783fd not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.338945 4762 scope.go:117] "RemoveContainer" containerID="56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.339285 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad"} err="failed to get container status \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": rpc error: code = NotFound desc = could not find container \"56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad\": container with ID starting with 56478c748827c9dc4459fe156ac3c403795d8c7024f29a38bd2e09a8228a5cad not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.339314 4762 scope.go:117] "RemoveContainer" containerID="f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.339524 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.350741 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.357786 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.358958 4762 scope.go:117] "RemoveContainer" containerID="f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.371966 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.378960 4762 scope.go:117] "RemoveContainer" containerID="38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.383131 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.395589 4762 scope.go:117] "RemoveContainer" containerID="f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.396674 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": container with ID starting with f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7 not found: ID does not exist" containerID="f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.396709 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7"} err="failed 
to get container status \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": rpc error: code = NotFound desc = could not find container \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": container with ID starting with f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.396731 4762 scope.go:117] "RemoveContainer" containerID="f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.396963 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": container with ID starting with f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434 not found: ID does not exist" containerID="f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.396983 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434"} err="failed to get container status \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": rpc error: code = NotFound desc = could not find container \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": container with ID starting with f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.396995 4762 scope.go:117] "RemoveContainer" containerID="38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" Feb 17 18:08:18 crc kubenswrapper[4762]: E0217 18:08:18.397156 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": container with ID starting with 38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f not found: ID does not exist" containerID="38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.397180 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f"} err="failed to get container status \"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": rpc error: code = NotFound desc = could not find container \"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": container with ID starting with 38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.397193 4762 scope.go:117] "RemoveContainer" containerID="f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.397576 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7"} err="failed to get container status \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": rpc error: code = NotFound desc = could not find container \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": container with ID starting with f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.397596 4762 scope.go:117] "RemoveContainer" containerID="f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.397812 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434"} err="failed to get container status \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": rpc error: code = NotFound desc = could not find container \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": container with ID starting with f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.397830 4762 scope.go:117] "RemoveContainer" containerID="38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398025 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f"} err="failed to get container status \"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": rpc error: code = NotFound desc = could not find container \"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": container with ID starting with 38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398049 4762 scope.go:117] "RemoveContainer" containerID="f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398268 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7"} err="failed to get container status \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": rpc error: code = NotFound desc = could not find container \"f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7\": container with ID starting with f86cce396d9cb89694dc04018ab98bf566fcb009504369bfa7e53ce95653b9b7 not found: ID does not 
exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398293 4762 scope.go:117] "RemoveContainer" containerID="f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398516 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434"} err="failed to get container status \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": rpc error: code = NotFound desc = could not find container \"f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434\": container with ID starting with f93dc8cad07df750b47f20c8137fc37bbc3dcbc9f803ee38e368a4cbb2934434 not found: ID does not exist" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398543 4762 scope.go:117] "RemoveContainer" containerID="38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f" Feb 17 18:08:18 crc kubenswrapper[4762]: I0217 18:08:18.398800 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f"} err="failed to get container status \"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": rpc error: code = NotFound desc = could not find container \"38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f\": container with ID starting with 38569f8d9bb225412ff9f8bf6bcb1a3dbabb596e98872bea6dc3000c7f33cb1f not found: ID does not exist" Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.045549 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" path="/var/lib/kubelet/pods/3f51072c-35bd-4c70-ac0a-307406b3dcc8/volumes" Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.046578 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" 
path="/var/lib/kubelet/pods/79fcc6fe-4adb-4638-a9b4-bbe2b968954f/volumes" Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.048195 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" path="/var/lib/kubelet/pods/d5f95fec-67fe-4a75-9aed-6ac0944ea78e/volumes" Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.048924 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" path="/var/lib/kubelet/pods/e363f68f-6964-4929-a32e-a0c55a4dabef/volumes" Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.146418 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.146723 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-log" containerID="cri-o://d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0" gracePeriod=30 Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.146807 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-httpd" containerID="cri-o://5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1" gracePeriod=30 Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.146821 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-api" containerID="cri-o://3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7" gracePeriod=30 Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.652574 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.653133 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-api" containerID="cri-o://a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670" gracePeriod=30 Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.653196 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-log" containerID="cri-o://61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99" gracePeriod=30 Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.653145 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-httpd" containerID="cri-o://f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311" gracePeriod=30 Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.899285 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945146 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-dev\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945264 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945306 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-lib-modules\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945363 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-logs\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945387 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-scripts\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") " Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945407 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-sys\") pod 
\"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945424 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945428 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-dev" (OuterVolumeSpecName: "dev") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945461 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-nvme\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945489 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-var-locks-brick\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945515 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-iscsi\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945570 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2ztt\" (UniqueName: \"kubernetes.io/projected/64958179-a093-4f81-a142-ae9b2f42b19c-kube-api-access-c2ztt\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945594 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-config-data\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945641 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-run\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945691 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-httpd-run\") pod \"64958179-a093-4f81-a142-ae9b2f42b19c\" (UID: \"64958179-a093-4f81-a142-ae9b2f42b19c\") "
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.946036 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-dev\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945498 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945564 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.945586 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-sys" (OuterVolumeSpecName: "sys") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.946358 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.946357 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.946389 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.946444 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-run" (OuterVolumeSpecName: "run") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.946695 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-logs" (OuterVolumeSpecName: "logs") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.952069 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance-cache") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.952335 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.952335 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-scripts" (OuterVolumeSpecName: "scripts") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:08:19 crc kubenswrapper[4762]: I0217 18:08:19.953995 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64958179-a093-4f81-a142-ae9b2f42b19c-kube-api-access-c2ztt" (OuterVolumeSpecName: "kube-api-access-c2ztt") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "kube-api-access-c2ztt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007279 4762 generic.go:334] "Generic (PLEG): container finished" podID="64958179-a093-4f81-a142-ae9b2f42b19c" containerID="3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7" exitCode=0
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007316 4762 generic.go:334] "Generic (PLEG): container finished" podID="64958179-a093-4f81-a142-ae9b2f42b19c" containerID="5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1" exitCode=0
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007324 4762 generic.go:334] "Generic (PLEG): container finished" podID="64958179-a093-4f81-a142-ae9b2f42b19c" containerID="d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0" exitCode=143
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007335 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007378 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerDied","Data":"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"}
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerDied","Data":"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"}
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007446 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerDied","Data":"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"}
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007458 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"64958179-a093-4f81-a142-ae9b2f42b19c","Type":"ContainerDied","Data":"2c7f32adad765678fc7a42bde5358b58d98d838f5eb75a2207849dd062048673"}
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.007477 4762 scope.go:117] "RemoveContainer" containerID="3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.012414 4762 generic.go:334] "Generic (PLEG): container finished" podID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerID="f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311" exitCode=0
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.012442 4762 generic.go:334] "Generic (PLEG): container finished" podID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerID="61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99" exitCode=143
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.012464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerDied","Data":"f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311"}
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.012488 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerDied","Data":"61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99"}
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.021912 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-config-data" (OuterVolumeSpecName: "config-data") pod "64958179-a093-4f81-a142-ae9b2f42b19c" (UID: "64958179-a093-4f81-a142-ae9b2f42b19c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.029191 4762 scope.go:117] "RemoveContainer" containerID="5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.049881 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-lib-modules\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.049918 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-logs\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.049954 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.049967 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-sys\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050001 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050012 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-nvme\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050022 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-var-locks-brick\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050034 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-etc-iscsi\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050044 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2ztt\" (UniqueName: \"kubernetes.io/projected/64958179-a093-4f81-a142-ae9b2f42b19c-kube-api-access-c2ztt\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050053 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64958179-a093-4f81-a142-ae9b2f42b19c-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050062 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64958179-a093-4f81-a142-ae9b2f42b19c-run\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050071 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64958179-a093-4f81-a142-ae9b2f42b19c-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.050087 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.051283 4762 scope.go:117] "RemoveContainer" containerID="d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.062773 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.064092 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.073764 4762 scope.go:117] "RemoveContainer" containerID="3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"
Feb 17 18:08:20 crc kubenswrapper[4762]: E0217 18:08:20.074365 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": container with ID starting with 3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7 not found: ID does not exist" containerID="3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.077240 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"} err="failed to get container status \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": rpc error: code = NotFound desc = could not find container \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": container with ID starting with 3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.077304 4762 scope.go:117] "RemoveContainer" containerID="5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"
Feb 17 18:08:20 crc kubenswrapper[4762]: E0217 18:08:20.078129 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": container with ID starting with 5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1 not found: ID does not exist" containerID="5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.078163 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"} err="failed to get container status \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": rpc error: code = NotFound desc = could not find container \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": container with ID starting with 5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.078183 4762 scope.go:117] "RemoveContainer" containerID="d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"
Feb 17 18:08:20 crc kubenswrapper[4762]: E0217 18:08:20.078676 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": container with ID starting with d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0 not found: ID does not exist" containerID="d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.078717 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"} err="failed to get container status \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": rpc error: code = NotFound desc = could not find container \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": container with ID starting with d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.078746 4762 scope.go:117] "RemoveContainer" containerID="3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.079036 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"} err="failed to get container status \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": rpc error: code = NotFound desc = could not find container \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": container with ID starting with 3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.079171 4762 scope.go:117] "RemoveContainer" containerID="5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.079475 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"} err="failed to get container status \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": rpc error: code = NotFound desc = could not find container \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": container with ID starting with 5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.079496 4762 scope.go:117] "RemoveContainer" containerID="d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.079927 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"} err="failed to get container status \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": rpc error: code = NotFound desc = could not find container \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": container with ID starting with d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.079950 4762 scope.go:117] "RemoveContainer" containerID="3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.080183 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7"} err="failed to get container status \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": rpc error: code = NotFound desc = could not find container \"3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7\": container with ID starting with 3433aeab88d78a370cdbbdb599bc9fa40cf92aa3f4d45dfc5c0bb128488caae7 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.080269 4762 scope.go:117] "RemoveContainer" containerID="5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.080717 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1"} err="failed to get container status \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": rpc error: code = NotFound desc = could not find container \"5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1\": container with ID starting with 5214ddb782d4ff20f202711ae60bd2bec495c1d83e6836e92a4de96446ff64a1 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.080746 4762 scope.go:117] "RemoveContainer" containerID="d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.081059 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0"} err="failed to get container status \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": rpc error: code = NotFound desc = could not find container \"d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0\": container with ID starting with d3a780b62eaf5f3f7b28db935af275ec9f10596720d91b49b8e1ad48e9f67db0 not found: ID does not exist"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.152182 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.152439 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.341152 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.346669 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.455962 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557364 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557423 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-httpd-run\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557459 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-var-locks-brick\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-dev\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557512 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-config-data\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557551 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-lib-modules\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557572 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557593 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-scripts\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557599 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557691 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557639 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-nvme\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557714 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557664 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-dev" (OuterVolumeSpecName: "dev") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557797 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd4bl\" (UniqueName: \"kubernetes.io/projected/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-kube-api-access-qd4bl\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557837 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-iscsi\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557858 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-sys\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557894 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-run\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557912 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557937 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-logs\") pod \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\" (UID: \"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41\") "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557936 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-sys" (OuterVolumeSpecName: "sys") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.557952 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-run" (OuterVolumeSpecName: "run") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558122 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558398 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-logs" (OuterVolumeSpecName: "logs") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558483 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-sys\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558494 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-iscsi\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558504 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-run\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558512 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-logs\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558523 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558532 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-var-locks-brick\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558539 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-dev\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558550 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-lib-modules\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.558559 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-etc-nvme\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.560734 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.561056 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-kube-api-access-qd4bl" (OuterVolumeSpecName: "kube-api-access-qd4bl") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "kube-api-access-qd4bl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.561594 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-scripts" (OuterVolumeSpecName: "scripts") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.562225 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.615161 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-config-data" (OuterVolumeSpecName: "config-data") pod "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" (UID: "f77f34ba-9c66-46ba-80ef-7f2a7ab61f41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.659532 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.659566 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.659583 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.659597 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.659610 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd4bl\" (UniqueName: \"kubernetes.io/projected/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41-kube-api-access-qd4bl\") on node \"crc\" DevicePath \"\""
Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.671799 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.671815 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.760588 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:20 crc kubenswrapper[4762]: I0217 18:08:20.760642 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.022586 4762 generic.go:334] "Generic (PLEG): container finished" podID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerID="a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670" exitCode=0 Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.022654 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.022654 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerDied","Data":"a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670"} Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.022751 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"f77f34ba-9c66-46ba-80ef-7f2a7ab61f41","Type":"ContainerDied","Data":"56c1504f2dc330aeae28bad450e86ec359afa872bfd84500d5f337bdc8824cf4"} Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.022769 4762 scope.go:117] "RemoveContainer" containerID="a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.043142 4762 scope.go:117] "RemoveContainer" containerID="f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.064295 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" path="/var/lib/kubelet/pods/64958179-a093-4f81-a142-ae9b2f42b19c/volumes" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.072469 4762 scope.go:117] "RemoveContainer" containerID="61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.072682 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.079739 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.093880 4762 scope.go:117] "RemoveContainer" 
containerID="a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670" Feb 17 18:08:21 crc kubenswrapper[4762]: E0217 18:08:21.094550 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670\": container with ID starting with a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670 not found: ID does not exist" containerID="a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.094593 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670"} err="failed to get container status \"a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670\": rpc error: code = NotFound desc = could not find container \"a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670\": container with ID starting with a63fcf68b2446526896c99308b6815677ee06e207fab8db6e12f0015c5cf0670 not found: ID does not exist" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.094639 4762 scope.go:117] "RemoveContainer" containerID="f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311" Feb 17 18:08:21 crc kubenswrapper[4762]: E0217 18:08:21.095102 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311\": container with ID starting with f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311 not found: ID does not exist" containerID="f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.095167 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311"} err="failed to get container status \"f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311\": rpc error: code = NotFound desc = could not find container \"f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311\": container with ID starting with f019d44fed332de9c433acf805aad97e0e6c26b746d08fb598466ef6a3c0a311 not found: ID does not exist" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.095198 4762 scope.go:117] "RemoveContainer" containerID="61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99" Feb 17 18:08:21 crc kubenswrapper[4762]: E0217 18:08:21.095604 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99\": container with ID starting with 61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99 not found: ID does not exist" containerID="61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99" Feb 17 18:08:21 crc kubenswrapper[4762]: I0217 18:08:21.095639 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99"} err="failed to get container status \"61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99\": rpc error: code = NotFound desc = could not find container \"61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99\": container with ID starting with 61bcbc9992830c8496a50cd0bacb94e3b84d2b5a5cf14a8bdb96395569a5ee99 not found: ID does not exist" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.263253 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g6bfq"] Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.272279 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-db-sync-g6bfq"] Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283148 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glanced4c8-account-delete-xptgj"] Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283461 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283488 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283500 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283510 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283524 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283532 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283542 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283549 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283560 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-httpd" Feb 17 18:08:22 crc 
kubenswrapper[4762]: I0217 18:08:22.283567 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283577 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283587 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283601 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283610 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283639 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283647 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283663 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283670 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283681 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283688 4762 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283699 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283706 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283717 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283725 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283734 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283741 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283753 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283762 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283776 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283782 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283791 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283797 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283810 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283815 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: E0217 18:08:22.283824 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283829 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283941 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283952 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283961 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283972 4762 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283980 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283987 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.283995 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284003 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284012 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284021 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284026 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f95fec-67fe-4a75-9aed-6ac0944ea78e" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284032 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284038 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284044 4762 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="64958179-a093-4f81-a142-ae9b2f42b19c" containerName="glance-api" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284053 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fcc6fe-4adb-4638-a9b4-bbe2b968954f" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284061 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284067 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f51072c-35bd-4c70-ac0a-307406b3dcc8" containerName="glance-log" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284073 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e363f68f-6964-4929-a32e-a0c55a4dabef" containerName="glance-httpd" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.284681 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.325320 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanced4c8-account-delete-xptgj"] Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.482029 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfzmd\" (UniqueName: \"kubernetes.io/projected/b9906e0e-0ad7-4699-99c3-5618dee779bf-kube-api-access-dfzmd\") pod \"glanced4c8-account-delete-xptgj\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.482120 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9906e0e-0ad7-4699-99c3-5618dee779bf-operator-scripts\") pod \"glanced4c8-account-delete-xptgj\" 
(UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.583250 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfzmd\" (UniqueName: \"kubernetes.io/projected/b9906e0e-0ad7-4699-99c3-5618dee779bf-kube-api-access-dfzmd\") pod \"glanced4c8-account-delete-xptgj\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.583336 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9906e0e-0ad7-4699-99c3-5618dee779bf-operator-scripts\") pod \"glanced4c8-account-delete-xptgj\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.584007 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9906e0e-0ad7-4699-99c3-5618dee779bf-operator-scripts\") pod \"glanced4c8-account-delete-xptgj\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.604709 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfzmd\" (UniqueName: \"kubernetes.io/projected/b9906e0e-0ad7-4699-99c3-5618dee779bf-kube-api-access-dfzmd\") pod \"glanced4c8-account-delete-xptgj\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:22 crc kubenswrapper[4762]: I0217 18:08:22.900616 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:23 crc kubenswrapper[4762]: I0217 18:08:23.048188 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99397293-7cbd-48dd-b637-0805fe66ddb8" path="/var/lib/kubelet/pods/99397293-7cbd-48dd-b637-0805fe66ddb8/volumes" Feb 17 18:08:23 crc kubenswrapper[4762]: I0217 18:08:23.053286 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f77f34ba-9c66-46ba-80ef-7f2a7ab61f41" path="/var/lib/kubelet/pods/f77f34ba-9c66-46ba-80ef-7f2a7ab61f41/volumes" Feb 17 18:08:23 crc kubenswrapper[4762]: I0217 18:08:23.157894 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanced4c8-account-delete-xptgj"] Feb 17 18:08:24 crc kubenswrapper[4762]: I0217 18:08:24.057682 4762 generic.go:334] "Generic (PLEG): container finished" podID="b9906e0e-0ad7-4699-99c3-5618dee779bf" containerID="a0ba6fbf6af590f2dc3c16f6fd2a262a84344179a208628745a3f7139e65ba9b" exitCode=0 Feb 17 18:08:24 crc kubenswrapper[4762]: I0217 18:08:24.057747 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" event={"ID":"b9906e0e-0ad7-4699-99c3-5618dee779bf","Type":"ContainerDied","Data":"a0ba6fbf6af590f2dc3c16f6fd2a262a84344179a208628745a3f7139e65ba9b"} Feb 17 18:08:24 crc kubenswrapper[4762]: I0217 18:08:24.057978 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" event={"ID":"b9906e0e-0ad7-4699-99c3-5618dee779bf","Type":"ContainerStarted","Data":"17fd2f3e12974c3b54191b8bbc8a61cee850db9284adeb0914ecec37f53b6f7f"} Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.342578 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.525263 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9906e0e-0ad7-4699-99c3-5618dee779bf-operator-scripts\") pod \"b9906e0e-0ad7-4699-99c3-5618dee779bf\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.525553 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfzmd\" (UniqueName: \"kubernetes.io/projected/b9906e0e-0ad7-4699-99c3-5618dee779bf-kube-api-access-dfzmd\") pod \"b9906e0e-0ad7-4699-99c3-5618dee779bf\" (UID: \"b9906e0e-0ad7-4699-99c3-5618dee779bf\") " Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.526652 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9906e0e-0ad7-4699-99c3-5618dee779bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9906e0e-0ad7-4699-99c3-5618dee779bf" (UID: "b9906e0e-0ad7-4699-99c3-5618dee779bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.539482 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9906e0e-0ad7-4699-99c3-5618dee779bf-kube-api-access-dfzmd" (OuterVolumeSpecName: "kube-api-access-dfzmd") pod "b9906e0e-0ad7-4699-99c3-5618dee779bf" (UID: "b9906e0e-0ad7-4699-99c3-5618dee779bf"). InnerVolumeSpecName "kube-api-access-dfzmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.627238 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9906e0e-0ad7-4699-99c3-5618dee779bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:25 crc kubenswrapper[4762]: I0217 18:08:25.627282 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfzmd\" (UniqueName: \"kubernetes.io/projected/b9906e0e-0ad7-4699-99c3-5618dee779bf-kube-api-access-dfzmd\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:26 crc kubenswrapper[4762]: I0217 18:08:26.071456 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" event={"ID":"b9906e0e-0ad7-4699-99c3-5618dee779bf","Type":"ContainerDied","Data":"17fd2f3e12974c3b54191b8bbc8a61cee850db9284adeb0914ecec37f53b6f7f"} Feb 17 18:08:26 crc kubenswrapper[4762]: I0217 18:08:26.071497 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17fd2f3e12974c3b54191b8bbc8a61cee850db9284adeb0914ecec37f53b6f7f" Feb 17 18:08:26 crc kubenswrapper[4762]: I0217 18:08:26.071544 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanced4c8-account-delete-xptgj" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.308061 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-dcvjk"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.313275 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-dcvjk"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.333031 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glanced4c8-account-delete-xptgj"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.341460 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.349295 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glanced4c8-account-delete-xptgj"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.357297 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-d4c8-account-create-update-ntnlh"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.652059 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-xxctl"] Feb 17 18:08:27 crc kubenswrapper[4762]: E0217 18:08:27.652327 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9906e0e-0ad7-4699-99c3-5618dee779bf" containerName="mariadb-account-delete" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.652344 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9906e0e-0ad7-4699-99c3-5618dee779bf" containerName="mariadb-account-delete" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.652482 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9906e0e-0ad7-4699-99c3-5618dee779bf" containerName="mariadb-account-delete" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.652913 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.661358 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-xxctl"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.667855 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-aee5-account-create-update-8x8tp"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.668802 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.672835 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.679662 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-aee5-account-create-update-8x8tp"] Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.854177 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7nv7\" (UniqueName: \"kubernetes.io/projected/359b218a-4867-47c2-ab60-f717d3105e86-kube-api-access-r7nv7\") pod \"glance-db-create-xxctl\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.854229 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g786c\" (UniqueName: \"kubernetes.io/projected/3736be6a-0bad-4095-bed7-301ba6790a21-kube-api-access-g786c\") pod \"glance-aee5-account-create-update-8x8tp\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.854265 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3736be6a-0bad-4095-bed7-301ba6790a21-operator-scripts\") pod \"glance-aee5-account-create-update-8x8tp\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.854338 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359b218a-4867-47c2-ab60-f717d3105e86-operator-scripts\") pod \"glance-db-create-xxctl\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.955829 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359b218a-4867-47c2-ab60-f717d3105e86-operator-scripts\") pod \"glance-db-create-xxctl\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.955952 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7nv7\" (UniqueName: \"kubernetes.io/projected/359b218a-4867-47c2-ab60-f717d3105e86-kube-api-access-r7nv7\") pod \"glance-db-create-xxctl\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.955987 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g786c\" (UniqueName: \"kubernetes.io/projected/3736be6a-0bad-4095-bed7-301ba6790a21-kube-api-access-g786c\") pod \"glance-aee5-account-create-update-8x8tp\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:27 
crc kubenswrapper[4762]: I0217 18:08:27.956018 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3736be6a-0bad-4095-bed7-301ba6790a21-operator-scripts\") pod \"glance-aee5-account-create-update-8x8tp\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.956754 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359b218a-4867-47c2-ab60-f717d3105e86-operator-scripts\") pod \"glance-db-create-xxctl\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.956761 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3736be6a-0bad-4095-bed7-301ba6790a21-operator-scripts\") pod \"glance-aee5-account-create-update-8x8tp\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.988426 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7nv7\" (UniqueName: \"kubernetes.io/projected/359b218a-4867-47c2-ab60-f717d3105e86-kube-api-access-r7nv7\") pod \"glance-db-create-xxctl\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:27 crc kubenswrapper[4762]: I0217 18:08:27.994256 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g786c\" (UniqueName: \"kubernetes.io/projected/3736be6a-0bad-4095-bed7-301ba6790a21-kube-api-access-g786c\") pod \"glance-aee5-account-create-update-8x8tp\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " 
pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:28 crc kubenswrapper[4762]: I0217 18:08:28.268107 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:28 crc kubenswrapper[4762]: I0217 18:08:28.283872 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:28 crc kubenswrapper[4762]: I0217 18:08:28.511155 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-aee5-account-create-update-8x8tp"] Feb 17 18:08:28 crc kubenswrapper[4762]: I0217 18:08:28.706605 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-xxctl"] Feb 17 18:08:28 crc kubenswrapper[4762]: W0217 18:08:28.713233 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod359b218a_4867_47c2_ab60_f717d3105e86.slice/crio-2a10abd9feed4e2e9ff9274bf24183e22642c5a0d88fdb6b67c22b86d861158b WatchSource:0}: Error finding container 2a10abd9feed4e2e9ff9274bf24183e22642c5a0d88fdb6b67c22b86d861158b: Status 404 returned error can't find the container with id 2a10abd9feed4e2e9ff9274bf24183e22642c5a0d88fdb6b67c22b86d861158b Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.049000 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e4f362-7d19-48a7-a297-bae1fb8cdf8b" path="/var/lib/kubelet/pods/71e4f362-7d19-48a7-a297-bae1fb8cdf8b/volumes" Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.049741 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9906e0e-0ad7-4699-99c3-5618dee779bf" path="/var/lib/kubelet/pods/b9906e0e-0ad7-4699-99c3-5618dee779bf/volumes" Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.050386 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c00b31dd-8e0a-40c6-8761-205f14bf1bde" path="/var/lib/kubelet/pods/c00b31dd-8e0a-40c6-8761-205f14bf1bde/volumes" Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.094834 4762 generic.go:334] "Generic (PLEG): container finished" podID="3736be6a-0bad-4095-bed7-301ba6790a21" containerID="7803c4d689dacc7d0a85e7769534ae2060fd784783d1b7844c4de1238094ae94" exitCode=0 Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.094940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" event={"ID":"3736be6a-0bad-4095-bed7-301ba6790a21","Type":"ContainerDied","Data":"7803c4d689dacc7d0a85e7769534ae2060fd784783d1b7844c4de1238094ae94"} Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.095002 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" event={"ID":"3736be6a-0bad-4095-bed7-301ba6790a21","Type":"ContainerStarted","Data":"f2aca50f4d7dfe86fba3b9d3641e90588973dd2b9c4340f9a42f34f83aed0be8"} Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.101766 4762 generic.go:334] "Generic (PLEG): container finished" podID="359b218a-4867-47c2-ab60-f717d3105e86" containerID="9b1036170c641db2caf2c5258948a281094ee79924851b898529cd3836fd63ed" exitCode=0 Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.101824 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-xxctl" event={"ID":"359b218a-4867-47c2-ab60-f717d3105e86","Type":"ContainerDied","Data":"9b1036170c641db2caf2c5258948a281094ee79924851b898529cd3836fd63ed"} Feb 17 18:08:29 crc kubenswrapper[4762]: I0217 18:08:29.101858 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-xxctl" event={"ID":"359b218a-4867-47c2-ab60-f717d3105e86","Type":"ContainerStarted","Data":"2a10abd9feed4e2e9ff9274bf24183e22642c5a0d88fdb6b67c22b86d861158b"} Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.420362 4762 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.425191 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.589920 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7nv7\" (UniqueName: \"kubernetes.io/projected/359b218a-4867-47c2-ab60-f717d3105e86-kube-api-access-r7nv7\") pod \"359b218a-4867-47c2-ab60-f717d3105e86\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.590208 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g786c\" (UniqueName: \"kubernetes.io/projected/3736be6a-0bad-4095-bed7-301ba6790a21-kube-api-access-g786c\") pod \"3736be6a-0bad-4095-bed7-301ba6790a21\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.590325 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359b218a-4867-47c2-ab60-f717d3105e86-operator-scripts\") pod \"359b218a-4867-47c2-ab60-f717d3105e86\" (UID: \"359b218a-4867-47c2-ab60-f717d3105e86\") " Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.590414 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3736be6a-0bad-4095-bed7-301ba6790a21-operator-scripts\") pod \"3736be6a-0bad-4095-bed7-301ba6790a21\" (UID: \"3736be6a-0bad-4095-bed7-301ba6790a21\") " Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.590827 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3736be6a-0bad-4095-bed7-301ba6790a21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3736be6a-0bad-4095-bed7-301ba6790a21" (UID: "3736be6a-0bad-4095-bed7-301ba6790a21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.591015 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359b218a-4867-47c2-ab60-f717d3105e86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "359b218a-4867-47c2-ab60-f717d3105e86" (UID: "359b218a-4867-47c2-ab60-f717d3105e86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.595474 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3736be6a-0bad-4095-bed7-301ba6790a21-kube-api-access-g786c" (OuterVolumeSpecName: "kube-api-access-g786c") pod "3736be6a-0bad-4095-bed7-301ba6790a21" (UID: "3736be6a-0bad-4095-bed7-301ba6790a21"). InnerVolumeSpecName "kube-api-access-g786c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.600782 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359b218a-4867-47c2-ab60-f717d3105e86-kube-api-access-r7nv7" (OuterVolumeSpecName: "kube-api-access-r7nv7") pod "359b218a-4867-47c2-ab60-f717d3105e86" (UID: "359b218a-4867-47c2-ab60-f717d3105e86"). InnerVolumeSpecName "kube-api-access-r7nv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.692255 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359b218a-4867-47c2-ab60-f717d3105e86-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.692299 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3736be6a-0bad-4095-bed7-301ba6790a21-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.692313 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7nv7\" (UniqueName: \"kubernetes.io/projected/359b218a-4867-47c2-ab60-f717d3105e86-kube-api-access-r7nv7\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:30 crc kubenswrapper[4762]: I0217 18:08:30.692327 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g786c\" (UniqueName: \"kubernetes.io/projected/3736be6a-0bad-4095-bed7-301ba6790a21-kube-api-access-g786c\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:31 crc kubenswrapper[4762]: I0217 18:08:31.118205 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" event={"ID":"3736be6a-0bad-4095-bed7-301ba6790a21","Type":"ContainerDied","Data":"f2aca50f4d7dfe86fba3b9d3641e90588973dd2b9c4340f9a42f34f83aed0be8"} Feb 17 18:08:31 crc kubenswrapper[4762]: I0217 18:08:31.118232 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-aee5-account-create-update-8x8tp" Feb 17 18:08:31 crc kubenswrapper[4762]: I0217 18:08:31.118243 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2aca50f4d7dfe86fba3b9d3641e90588973dd2b9c4340f9a42f34f83aed0be8" Feb 17 18:08:31 crc kubenswrapper[4762]: I0217 18:08:31.119844 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-xxctl" event={"ID":"359b218a-4867-47c2-ab60-f717d3105e86","Type":"ContainerDied","Data":"2a10abd9feed4e2e9ff9274bf24183e22642c5a0d88fdb6b67c22b86d861158b"} Feb 17 18:08:31 crc kubenswrapper[4762]: I0217 18:08:31.119869 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a10abd9feed4e2e9ff9274bf24183e22642c5a0d88fdb6b67c22b86d861158b" Feb 17 18:08:31 crc kubenswrapper[4762]: I0217 18:08:31.119989 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xxctl" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.786214 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-c4hz2"] Feb 17 18:08:32 crc kubenswrapper[4762]: E0217 18:08:32.787580 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3736be6a-0bad-4095-bed7-301ba6790a21" containerName="mariadb-account-create-update" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.787701 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3736be6a-0bad-4095-bed7-301ba6790a21" containerName="mariadb-account-create-update" Feb 17 18:08:32 crc kubenswrapper[4762]: E0217 18:08:32.787764 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359b218a-4867-47c2-ab60-f717d3105e86" containerName="mariadb-database-create" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.787814 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="359b218a-4867-47c2-ab60-f717d3105e86" 
containerName="mariadb-database-create" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.787980 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3736be6a-0bad-4095-bed7-301ba6790a21" containerName="mariadb-account-create-update" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.788037 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="359b218a-4867-47c2-ab60-f717d3105e86" containerName="mariadb-database-create" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.788503 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.791151 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.791508 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-lrmhq" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.797605 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-c4hz2"] Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.926275 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-db-sync-config-data\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.926381 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjzs\" (UniqueName: \"kubernetes.io/projected/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-kube-api-access-vsjzs\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 
17 18:08:32 crc kubenswrapper[4762]: I0217 18:08:32.926440 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-config-data\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.028351 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-db-sync-config-data\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.028576 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjzs\" (UniqueName: \"kubernetes.io/projected/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-kube-api-access-vsjzs\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.028716 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-config-data\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.033772 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-db-sync-config-data\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.056539 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-config-data\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.060731 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjzs\" (UniqueName: \"kubernetes.io/projected/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-kube-api-access-vsjzs\") pod \"glance-db-sync-c4hz2\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.103512 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:33 crc kubenswrapper[4762]: I0217 18:08:33.537310 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-c4hz2"] Feb 17 18:08:33 crc kubenswrapper[4762]: W0217 18:08:33.549912 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45cef97c_2248_4f8d_9e35_0f8d9db0e6be.slice/crio-ef5b25b390cc646138a640ff1d24553b4fcc94eed7241dc4afa1e085ac1abc65 WatchSource:0}: Error finding container ef5b25b390cc646138a640ff1d24553b4fcc94eed7241dc4afa1e085ac1abc65: Status 404 returned error can't find the container with id ef5b25b390cc646138a640ff1d24553b4fcc94eed7241dc4afa1e085ac1abc65 Feb 17 18:08:34 crc kubenswrapper[4762]: I0217 18:08:34.138108 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-c4hz2" event={"ID":"45cef97c-2248-4f8d-9e35-0f8d9db0e6be","Type":"ContainerStarted","Data":"510ea20aff458e12a1c9f80ec2d286bfe370ed701ead3fe9bb442614a8bf9a73"} Feb 17 18:08:34 crc kubenswrapper[4762]: I0217 18:08:34.138463 4762 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="glance-kuttl-tests/glance-db-sync-c4hz2" event={"ID":"45cef97c-2248-4f8d-9e35-0f8d9db0e6be","Type":"ContainerStarted","Data":"ef5b25b390cc646138a640ff1d24553b4fcc94eed7241dc4afa1e085ac1abc65"} Feb 17 18:08:34 crc kubenswrapper[4762]: I0217 18:08:34.158280 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-c4hz2" podStartSLOduration=2.158263136 podStartE2EDuration="2.158263136s" podCreationTimestamp="2026-02-17 18:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:34.153188012 +0000 UTC m=+1265.798106052" watchObservedRunningTime="2026-02-17 18:08:34.158263136 +0000 UTC m=+1265.803181146" Feb 17 18:08:37 crc kubenswrapper[4762]: I0217 18:08:37.169565 4762 generic.go:334] "Generic (PLEG): container finished" podID="45cef97c-2248-4f8d-9e35-0f8d9db0e6be" containerID="510ea20aff458e12a1c9f80ec2d286bfe370ed701ead3fe9bb442614a8bf9a73" exitCode=0 Feb 17 18:08:37 crc kubenswrapper[4762]: I0217 18:08:37.169656 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-c4hz2" event={"ID":"45cef97c-2248-4f8d-9e35-0f8d9db0e6be","Type":"ContainerDied","Data":"510ea20aff458e12a1c9f80ec2d286bfe370ed701ead3fe9bb442614a8bf9a73"} Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.534503 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.710985 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-db-sync-config-data\") pod \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.711105 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-config-data\") pod \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.711159 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsjzs\" (UniqueName: \"kubernetes.io/projected/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-kube-api-access-vsjzs\") pod \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\" (UID: \"45cef97c-2248-4f8d-9e35-0f8d9db0e6be\") " Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.718424 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-kube-api-access-vsjzs" (OuterVolumeSpecName: "kube-api-access-vsjzs") pod "45cef97c-2248-4f8d-9e35-0f8d9db0e6be" (UID: "45cef97c-2248-4f8d-9e35-0f8d9db0e6be"). InnerVolumeSpecName "kube-api-access-vsjzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.718579 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "45cef97c-2248-4f8d-9e35-0f8d9db0e6be" (UID: "45cef97c-2248-4f8d-9e35-0f8d9db0e6be"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.756512 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-config-data" (OuterVolumeSpecName: "config-data") pod "45cef97c-2248-4f8d-9e35-0f8d9db0e6be" (UID: "45cef97c-2248-4f8d-9e35-0f8d9db0e6be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.812645 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.812691 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsjzs\" (UniqueName: \"kubernetes.io/projected/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-kube-api-access-vsjzs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:38 crc kubenswrapper[4762]: I0217 18:08:38.812732 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/45cef97c-2248-4f8d-9e35-0f8d9db0e6be-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:39 crc kubenswrapper[4762]: I0217 18:08:39.185616 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-c4hz2" event={"ID":"45cef97c-2248-4f8d-9e35-0f8d9db0e6be","Type":"ContainerDied","Data":"ef5b25b390cc646138a640ff1d24553b4fcc94eed7241dc4afa1e085ac1abc65"} Feb 17 18:08:39 crc kubenswrapper[4762]: I0217 18:08:39.185954 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef5b25b390cc646138a640ff1d24553b4fcc94eed7241dc4afa1e085ac1abc65" Feb 17 18:08:39 crc kubenswrapper[4762]: I0217 18:08:39.185706 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-c4hz2" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.265241 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:08:40 crc kubenswrapper[4762]: E0217 18:08:40.265755 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cef97c-2248-4f8d-9e35-0f8d9db0e6be" containerName="glance-db-sync" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.265774 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cef97c-2248-4f8d-9e35-0f8d9db0e6be" containerName="glance-db-sync" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.266005 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="45cef97c-2248-4f8d-9e35-0f8d9db0e6be" containerName="glance-db-sync" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.267120 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.270061 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.270298 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.270474 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-lrmhq" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.291400 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438112 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-run\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438205 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-sys\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-logs\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438777 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438826 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438906 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438928 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438944 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-dev\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.438974 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.439055 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glz9\" (UniqueName: \"kubernetes.io/projected/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-kube-api-access-4glz9\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.540928 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.540987 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541015 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541010 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541078 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541089 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541115 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-dev\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541155 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4glz9\" (UniqueName: \"kubernetes.io/projected/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-kube-api-access-4glz9\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541208 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-dev\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541239 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-run\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541266 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-run\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541447 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541518 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541560 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541637 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541663 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541705 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-sys\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541772 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541794 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-sys\") pod \"glance-default-external-api-0\" (UID: 
\"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541794 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.541897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-logs\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.542612 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-logs\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.547112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.550932 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.560826 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glz9\" (UniqueName: \"kubernetes.io/projected/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-kube-api-access-4glz9\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.570409 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.594890 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.774913 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.776261 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.784145 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.797889 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.884034 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.953874 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-dev\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.954711 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.954798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.954834 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.954930 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.954980 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mdlh\" (UniqueName: \"kubernetes.io/projected/e8ec36cc-0dd8-4298-b37b-71022f74797b-kube-api-access-9mdlh\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955014 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-sys\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955054 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955116 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955155 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955187 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-run\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:40 crc kubenswrapper[4762]: I0217 18:08:40.955284 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.056872 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.057202 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.057806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-run\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.057856 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.057891 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.057938 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-dev\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.057958 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.058023 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.059724 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.060442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-dev\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.060461 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.060505 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-run\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.060981 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.060985 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.061063 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.061114 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.061416 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.061547 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.061798 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mdlh\" (UniqueName: \"kubernetes.io/projected/e8ec36cc-0dd8-4298-b37b-71022f74797b-kube-api-access-9mdlh\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.062466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-sys\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.062941 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-sys\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.062840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.063162 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.063078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.064073 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.064278 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.064835 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.081140 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mdlh\" (UniqueName: \"kubernetes.io/projected/e8ec36cc-0dd8-4298-b37b-71022f74797b-kube-api-access-9mdlh\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.094074 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.097280 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.233736 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.234337 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.359825 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:08:41 crc kubenswrapper[4762]: I0217 18:08:41.669101 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:41 crc kubenswrapper[4762]: W0217 18:08:41.670880 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ec36cc_0dd8_4298_b37b_71022f74797b.slice/crio-90af44325b0eb3d9e5288896849dbb2aa75289b345548c1aa9821fad424c74e1 WatchSource:0}: Error finding container 90af44325b0eb3d9e5288896849dbb2aa75289b345548c1aa9821fad424c74e1: Status 404 returned error can't find the container with id 90af44325b0eb3d9e5288896849dbb2aa75289b345548c1aa9821fad424c74e1 Feb 17 18:08:42 crc kubenswrapper[4762]: E0217 18:08:42.063294 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-storage-0" podUID="ae866fa5-748d-4935-a3d2-2fe08bc9693f" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.207906 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf","Type":"ContainerStarted","Data":"0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9"} Feb 17 18:08:42 crc kubenswrapper[4762]: 
I0217 18:08:42.207970 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf","Type":"ContainerStarted","Data":"a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f"} Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.207994 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf","Type":"ContainerStarted","Data":"55966b6e0c9ca648e148698cf8964273fc0ff72d9da178757a6d998a6cb1b449"} Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.210900 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.211987 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-log" containerID="cri-o://0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400" gracePeriod=30 Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.212381 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8ec36cc-0dd8-4298-b37b-71022f74797b","Type":"ContainerStarted","Data":"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94"} Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.212464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8ec36cc-0dd8-4298-b37b-71022f74797b","Type":"ContainerStarted","Data":"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400"} Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.212479 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"e8ec36cc-0dd8-4298-b37b-71022f74797b","Type":"ContainerStarted","Data":"90af44325b0eb3d9e5288896849dbb2aa75289b345548c1aa9821fad424c74e1"} Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.212565 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-httpd" containerID="cri-o://11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94" gracePeriod=30 Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.245480 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.2454581510000002 podStartE2EDuration="2.245458151s" podCreationTimestamp="2026-02-17 18:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:42.233724767 +0000 UTC m=+1273.878642797" watchObservedRunningTime="2026-02-17 18:08:42.245458151 +0000 UTC m=+1273.890376161" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.262281 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.262258099 podStartE2EDuration="3.262258099s" podCreationTimestamp="2026-02-17 18:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:42.262214838 +0000 UTC m=+1273.907132888" watchObservedRunningTime="2026-02-17 18:08:42.262258099 +0000 UTC m=+1273.907176139" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.548123 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692258 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-logs\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692892 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-nvme\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692918 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-run\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692956 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-sys\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692754 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-logs" (OuterVolumeSpecName: "logs") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692939 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692965 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-run" (OuterVolumeSpecName: "run") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.692986 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693053 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-sys" (OuterVolumeSpecName: "sys") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693066 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693114 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mdlh\" (UniqueName: \"kubernetes.io/projected/e8ec36cc-0dd8-4298-b37b-71022f74797b-kube-api-access-9mdlh\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693149 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-iscsi\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693180 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-config-data\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693222 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-lib-modules\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693251 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-iscsi" 
(OuterVolumeSpecName: "etc-iscsi") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693260 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-dev\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-dev" (OuterVolumeSpecName: "dev") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693337 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-httpd-run\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693359 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693392 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-scripts\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693686 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-var-locks-brick\") pod \"e8ec36cc-0dd8-4298-b37b-71022f74797b\" (UID: \"e8ec36cc-0dd8-4298-b37b-71022f74797b\") " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693704 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.693771 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694506 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694532 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694542 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694555 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec36cc-0dd8-4298-b37b-71022f74797b-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694566 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694580 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694589 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694599 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.694608 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8ec36cc-0dd8-4298-b37b-71022f74797b-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.698652 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-scripts" (OuterVolumeSpecName: "scripts") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.698810 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.698951 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ec36cc-0dd8-4298-b37b-71022f74797b-kube-api-access-9mdlh" (OuterVolumeSpecName: "kube-api-access-9mdlh") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "kube-api-access-9mdlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.701767 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). 
InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.730697 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-config-data" (OuterVolumeSpecName: "config-data") pod "e8ec36cc-0dd8-4298-b37b-71022f74797b" (UID: "e8ec36cc-0dd8-4298-b37b-71022f74797b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.796467 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.796511 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.796529 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mdlh\" (UniqueName: \"kubernetes.io/projected/e8ec36cc-0dd8-4298-b37b-71022f74797b-kube-api-access-9mdlh\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.796544 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.796556 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec36cc-0dd8-4298-b37b-71022f74797b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.808785 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.818527 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.897607 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:42 crc kubenswrapper[4762]: I0217 18:08:42.897659 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219009 4762 generic.go:334] "Generic (PLEG): container finished" podID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerID="11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94" exitCode=143 Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219359 4762 generic.go:334] "Generic (PLEG): container finished" podID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerID="0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400" exitCode=143 Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219072 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219101 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8ec36cc-0dd8-4298-b37b-71022f74797b","Type":"ContainerDied","Data":"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94"} Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219442 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8ec36cc-0dd8-4298-b37b-71022f74797b","Type":"ContainerDied","Data":"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400"} Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219466 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8ec36cc-0dd8-4298-b37b-71022f74797b","Type":"ContainerDied","Data":"90af44325b0eb3d9e5288896849dbb2aa75289b345548c1aa9821fad424c74e1"} Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.219485 4762 scope.go:117] "RemoveContainer" containerID="11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.240027 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.241285 4762 scope.go:117] "RemoveContainer" containerID="0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.250067 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.263088 4762 scope.go:117] "RemoveContainer" containerID="11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94" Feb 17 18:08:43 crc kubenswrapper[4762]: E0217 18:08:43.263523 4762 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94\": container with ID starting with 11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94 not found: ID does not exist" containerID="11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.263553 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94"} err="failed to get container status \"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94\": rpc error: code = NotFound desc = could not find container \"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94\": container with ID starting with 11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94 not found: ID does not exist" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.263606 4762 scope.go:117] "RemoveContainer" containerID="0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.263735 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:43 crc kubenswrapper[4762]: E0217 18:08:43.264016 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400\": container with ID starting with 0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400 not found: ID does not exist" containerID="0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400" Feb 17 18:08:43 crc kubenswrapper[4762]: E0217 18:08:43.264068 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" 
containerName="glance-httpd" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264082 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-httpd" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264069 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400"} err="failed to get container status \"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400\": rpc error: code = NotFound desc = could not find container \"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400\": container with ID starting with 0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400 not found: ID does not exist" Feb 17 18:08:43 crc kubenswrapper[4762]: E0217 18:08:43.264108 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-log" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264116 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-log" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264108 4762 scope.go:117] "RemoveContainer" containerID="11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264260 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-log" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264280 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" containerName="glance-httpd" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264561 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94"} err="failed to get container status \"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94\": rpc error: code = NotFound desc = could not find container \"11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94\": container with ID starting with 11f7be9cfe10550a2dbce081d227bfcb830e8e1f5f1bfb2472fa33e94508eb94 not found: ID does not exist" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.264601 4762 scope.go:117] "RemoveContainer" containerID="0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.265071 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.265152 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400"} err="failed to get container status \"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400\": rpc error: code = NotFound desc = could not find container \"0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400\": container with ID starting with 0f4a1126f2485301e82b86c2491d536bf09cd9a294fc810aaff4894512623400 not found: ID does not exist" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.267122 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.290714 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.406680 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbt7\" (UniqueName: 
\"kubernetes.io/projected/b8272a02-2571-40e7-9f82-8528eb93918c-kube-api-access-xfbt7\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.406822 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.406852 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.406963 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-dev\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.406999 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407035 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407053 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-run\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407071 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407110 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407124 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-sys\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407150 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407179 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.407244 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508476 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508533 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508584 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-dev\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508605 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508660 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508684 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-run\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508705 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508721 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508775 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-sys\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508809 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508837 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508864 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.508907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbt7\" (UniqueName: \"kubernetes.io/projected/b8272a02-2571-40e7-9f82-8528eb93918c-kube-api-access-xfbt7\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.509320 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-run\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.509363 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: 
\"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.509501 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.509669 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-dev\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.509889 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.509948 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.510069 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: 
\"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.510114 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-sys\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.510437 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.510570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.515402 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.516978 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 
crc kubenswrapper[4762]: I0217 18:08:43.535419 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.538144 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.542363 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbt7\" (UniqueName: \"kubernetes.io/projected/b8272a02-2571-40e7-9f82-8528eb93918c-kube-api-access-xfbt7\") pod \"glance-default-internal-api-0\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.579208 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:43 crc kubenswrapper[4762]: I0217 18:08:43.995433 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:08:44 crc kubenswrapper[4762]: I0217 18:08:44.228650 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b8272a02-2571-40e7-9f82-8528eb93918c","Type":"ContainerStarted","Data":"5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401"} Feb 17 18:08:44 crc kubenswrapper[4762]: I0217 18:08:44.229072 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b8272a02-2571-40e7-9f82-8528eb93918c","Type":"ContainerStarted","Data":"885ca62993dfe0a19a6385ed53cf93f8667c491f6ee2f320d502a15e0e10d4b2"} Feb 17 18:08:45 crc kubenswrapper[4762]: I0217 18:08:45.051082 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ec36cc-0dd8-4298-b37b-71022f74797b" path="/var/lib/kubelet/pods/e8ec36cc-0dd8-4298-b37b-71022f74797b/volumes" Feb 17 18:08:45 crc kubenswrapper[4762]: E0217 18:08:45.081378 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" podUID="e576e3fe-21e1-4867-adcc-bb586e3a5921" Feb 17 18:08:45 crc kubenswrapper[4762]: I0217 18:08:45.241068 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:08:45 crc kubenswrapper[4762]: I0217 18:08:45.241057 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b8272a02-2571-40e7-9f82-8528eb93918c","Type":"ContainerStarted","Data":"6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed"} Feb 17 18:08:45 crc kubenswrapper[4762]: I0217 18:08:45.283366 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.283340655 podStartE2EDuration="2.283340655s" podCreationTimestamp="2026-02-17 18:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:08:45.263256883 +0000 UTC m=+1276.908174903" watchObservedRunningTime="2026-02-17 18:08:45.283340655 +0000 UTC m=+1276.928258675" Feb 17 18:08:46 crc kubenswrapper[4762]: E0217 18:08:46.049273 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:08:46 crc kubenswrapper[4762]: E0217 18:08:46.049805 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:08:46 crc kubenswrapper[4762]: E0217 18:08:46.049890 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:10:48.049865247 +0000 UTC m=+1399.694783297 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:08:46 crc kubenswrapper[4762]: I0217 18:08:46.050600 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:08:48 crc kubenswrapper[4762]: I0217 18:08:48.495600 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:08:48 crc kubenswrapper[4762]: E0217 18:08:48.495798 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:08:48 crc kubenswrapper[4762]: E0217 18:08:48.495820 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:08:48 crc kubenswrapper[4762]: E0217 18:08:48.495891 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. No retries permitted until 2026-02-17 18:10:50.495868498 +0000 UTC m=+1402.140786518 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:08:50 crc kubenswrapper[4762]: I0217 18:08:50.885051 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:50 crc kubenswrapper[4762]: I0217 18:08:50.885693 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:50 crc kubenswrapper[4762]: I0217 18:08:50.908527 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:50 crc kubenswrapper[4762]: I0217 18:08:50.923892 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:51 crc kubenswrapper[4762]: I0217 18:08:51.293037 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:51 crc kubenswrapper[4762]: I0217 18:08:51.293091 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:53 crc kubenswrapper[4762]: I0217 18:08:53.262733 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:53 crc kubenswrapper[4762]: I0217 18:08:53.299257 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:08:53 crc kubenswrapper[4762]: I0217 18:08:53.580137 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 
18:08:53 crc kubenswrapper[4762]: I0217 18:08:53.580516 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:53 crc kubenswrapper[4762]: I0217 18:08:53.613323 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:53 crc kubenswrapper[4762]: I0217 18:08:53.628481 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:54 crc kubenswrapper[4762]: I0217 18:08:54.322722 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:54 crc kubenswrapper[4762]: I0217 18:08:54.322802 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:56 crc kubenswrapper[4762]: I0217 18:08:56.341324 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:08:56 crc kubenswrapper[4762]: I0217 18:08:56.341870 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:08:56 crc kubenswrapper[4762]: I0217 18:08:56.405681 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:56 crc kubenswrapper[4762]: I0217 18:08:56.412029 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.576339 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.579041 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.587812 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.589533 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.598430 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.608958 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672486 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-scripts\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672547 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-scripts\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672587 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672672 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-dev\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672698 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672725 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672760 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rdh\" (UniqueName: \"kubernetes.io/projected/647ea438-8124-4d89-8186-4a97f5fcc48f-kube-api-access-s9rdh\") pod 
\"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672823 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-config-data\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672846 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-config-data\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672872 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672916 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-run\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672935 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672955 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.672984 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-dev\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673016 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4q5v\" (UniqueName: \"kubernetes.io/projected/1c7d89a9-6206-4ffe-bd91-a407c2066f23-kube-api-access-r4q5v\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673045 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673065 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-run\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673087 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-sys\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673106 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673167 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-logs\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673189 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-sys\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673214 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-logs\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673238 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.673293 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 
18:08:59.673314 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.718149 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.721474 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.726239 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.727357 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.735540 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.742898 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774817 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-scripts\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-scripts\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774896 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774915 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774931 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-dev\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.774947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775009 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775060 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775197 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-sys\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775262 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rdh\" (UniqueName: 
\"kubernetes.io/projected/647ea438-8124-4d89-8186-4a97f5fcc48f-kube-api-access-s9rdh\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-config-data\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775332 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-config-data\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775369 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775446 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndng7\" (UniqueName: \"kubernetes.io/projected/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-kube-api-access-ndng7\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775467 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-dev\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775488 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-run\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775546 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775581 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-dev\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775607 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775678 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4q5v\" (UniqueName: \"kubernetes.io/projected/1c7d89a9-6206-4ffe-bd91-a407c2066f23-kube-api-access-r4q5v\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775705 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-dev\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775738 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775768 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-run\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775817 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-sys\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775871 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775893 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775950 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.775980 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776005 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-logs\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776029 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-sys\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776052 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776119 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-logs\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776130 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-run\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776160 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776181 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776183 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776216 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776232 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776231 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-dev\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776275 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-logs\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776304 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-run\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776351 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776377 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776400 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776412 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776513 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-logs\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776851 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") device 
mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776936 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776944 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.776995 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.777024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-run\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.777041 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-logs\") pod \"glance-default-external-api-1\" (UID: 
\"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.777052 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-sys\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.777077 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-sys\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.777216 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.785976 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-config-data\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.786278 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-scripts\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc 
kubenswrapper[4762]: I0217 18:08:59.795308 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4q5v\" (UniqueName: \"kubernetes.io/projected/1c7d89a9-6206-4ffe-bd91-a407c2066f23-kube-api-access-r4q5v\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.795833 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-scripts\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.796139 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-config-data\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.802613 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rdh\" (UniqueName: \"kubernetes.io/projected/647ea438-8124-4d89-8186-4a97f5fcc48f-kube-api-access-s9rdh\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.810474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.817002 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.822605 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.828466 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878219 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndng7\" (UniqueName: \"kubernetes.io/projected/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-kube-api-access-ndng7\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878572 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878597 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-dev\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878617 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878702 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878718 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878758 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-dev\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878784 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878808 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7cq\" (UniqueName: 
\"kubernetes.io/projected/e1efdbe3-5844-4335-959d-378864298dc1-kube-api-access-hm7cq\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878808 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878842 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878915 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878937 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878942 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.878953 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-logs\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879009 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-run\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879099 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879142 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879171 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879186 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-sys\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879199 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-logs\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879233 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879271 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-sys\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879342 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879367 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-run\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879379 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-logs\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879548 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-dev\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879576 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879586 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879616 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-sys\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879664 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879686 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.879706 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-run\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.880019 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.884597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.885506 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.899929 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndng7\" (UniqueName: \"kubernetes.io/projected/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-kube-api-access-ndng7\") pod 
\"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.902701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.903222 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.903927 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-1\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.907052 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.915829 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.981557 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.981607 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-run\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.981824 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.981898 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.981966 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-dev\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 
18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.982004 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7cq\" (UniqueName: \"kubernetes.io/projected/e1efdbe3-5844-4335-959d-378864298dc1-kube-api-access-hm7cq\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.982018 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.982031 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.982788 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983178 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-dev\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc 
kubenswrapper[4762]: I0217 18:08:59.983215 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983262 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983283 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983304 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-sys\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983323 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-logs\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983453 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983479 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983523 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983574 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983647 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-sys\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.983660 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-run\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.984971 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.985191 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-logs\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.990042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.990431 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:08:59 crc kubenswrapper[4762]: I0217 18:08:59.998672 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7cq\" (UniqueName: 
\"kubernetes.io/projected/e1efdbe3-5844-4335-959d-378864298dc1-kube-api-access-hm7cq\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.006040 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.045682 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.055840 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.344784 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:09:00 crc kubenswrapper[4762]: W0217 18:09:00.350022 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7d89a9_6206_4ffe_bd91_a407c2066f23.slice/crio-fcf1be0f3832a852b1be6b3719102880be056aa420cde124257f9bb7b49c1bd7 WatchSource:0}: Error finding container fcf1be0f3832a852b1be6b3719102880be056aa420cde124257f9bb7b49c1bd7: Status 404 returned error can't find the container with id fcf1be0f3832a852b1be6b3719102880be056aa420cde124257f9bb7b49c1bd7 Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.372263 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"1c7d89a9-6206-4ffe-bd91-a407c2066f23","Type":"ContainerStarted","Data":"fcf1be0f3832a852b1be6b3719102880be056aa420cde124257f9bb7b49c1bd7"} Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.397953 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.488081 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:09:00 crc kubenswrapper[4762]: I0217 18:09:00.494905 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:09:00 crc kubenswrapper[4762]: W0217 18:09:00.505756 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116a936d_e7fd_4c06_b3a0_1b56d0f7f30c.slice/crio-5dca558fff76c657eff8c627b17387aa664cc3f339960e8a8058232ab0d2fbcc WatchSource:0}: Error finding container 5dca558fff76c657eff8c627b17387aa664cc3f339960e8a8058232ab0d2fbcc: Status 404 returned error can't find the container with id 5dca558fff76c657eff8c627b17387aa664cc3f339960e8a8058232ab0d2fbcc Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.382035 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"647ea438-8124-4d89-8186-4a97f5fcc48f","Type":"ContainerStarted","Data":"7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.382608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"647ea438-8124-4d89-8186-4a97f5fcc48f","Type":"ContainerStarted","Data":"19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.382659 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"647ea438-8124-4d89-8186-4a97f5fcc48f","Type":"ContainerStarted","Data":"c4050882ded574a05efa46436e6bac4dc451279895662cad98af26125eb96827"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.384205 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c","Type":"ContainerStarted","Data":"52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.384237 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c","Type":"ContainerStarted","Data":"86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.384253 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c","Type":"ContainerStarted","Data":"5dca558fff76c657eff8c627b17387aa664cc3f339960e8a8058232ab0d2fbcc"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.387104 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e1efdbe3-5844-4335-959d-378864298dc1","Type":"ContainerStarted","Data":"4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.387152 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e1efdbe3-5844-4335-959d-378864298dc1","Type":"ContainerStarted","Data":"cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.387167 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e1efdbe3-5844-4335-959d-378864298dc1","Type":"ContainerStarted","Data":"216bd0ebdecf38b4ceeb699a485c3ec4ea153cfed3aec9b8dfddaf94da103cb1"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.400906 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"1c7d89a9-6206-4ffe-bd91-a407c2066f23","Type":"ContainerStarted","Data":"42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.400951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"1c7d89a9-6206-4ffe-bd91-a407c2066f23","Type":"ContainerStarted","Data":"075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7"} Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.425992 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.425919412 podStartE2EDuration="3.425919412s" podCreationTimestamp="2026-02-17 18:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:01.407242511 +0000 UTC m=+1293.052160531" watchObservedRunningTime="2026-02-17 18:09:01.425919412 +0000 UTC m=+1293.070837442" Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.437862 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.437844932 podStartE2EDuration="3.437844932s" podCreationTimestamp="2026-02-17 18:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:01.437154822 +0000 UTC m=+1293.082072842" watchObservedRunningTime="2026-02-17 18:09:01.437844932 +0000 UTC 
m=+1293.082762942" Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.474809 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.474789413 podStartE2EDuration="3.474789413s" podCreationTimestamp="2026-02-17 18:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:01.464816029 +0000 UTC m=+1293.109734039" watchObservedRunningTime="2026-02-17 18:09:01.474789413 +0000 UTC m=+1293.119707423" Feb 17 18:09:01 crc kubenswrapper[4762]: I0217 18:09:01.495258 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.495234945 podStartE2EDuration="3.495234945s" podCreationTimestamp="2026-02-17 18:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:01.4880272 +0000 UTC m=+1293.132945230" watchObservedRunningTime="2026-02-17 18:09:01.495234945 +0000 UTC m=+1293.140152955" Feb 17 18:09:04 crc kubenswrapper[4762]: I0217 18:09:04.559056 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:09:04 crc kubenswrapper[4762]: I0217 18:09:04.559704 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.908001 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.908587 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.916227 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.916271 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.933670 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.949649 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.960057 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:09 crc kubenswrapper[4762]: I0217 18:09:09.971026 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.047343 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.047669 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.056911 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.057503 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.071312 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.085484 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.086124 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.095196 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.470669 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.470930 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.470969 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.471962 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.472009 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:10 crc 
kubenswrapper[4762]: I0217 18:09:10.472024 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.472038 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:10 crc kubenswrapper[4762]: I0217 18:09:10.472053 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.445973 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.451109 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.490311 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.490346 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.627384 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.627492 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.648937 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.664508 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.664642 4762 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.790184 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.794235 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:12 crc kubenswrapper[4762]: I0217 18:09:12.907185 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:13 crc kubenswrapper[4762]: I0217 18:09:13.854194 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:09:13 crc kubenswrapper[4762]: I0217 18:09:13.866175 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.039135 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.048843 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.506418 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-log" containerID="cri-o://19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d" gracePeriod=30 Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.506514 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-httpd" 
containerID="cri-o://7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a" gracePeriod=30 Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.506921 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-log" containerID="cri-o://075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7" gracePeriod=30 Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.506971 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-httpd" containerID="cri-o://42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce" gracePeriod=30 Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.507069 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-log" containerID="cri-o://86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae" gracePeriod=30 Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.507146 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-httpd" containerID="cri-o://52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe" gracePeriod=30 Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.515774 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.128:9292/healthcheck\": EOF" Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.517249 4762 prober.go:107] "Probe 
failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.128:9292/healthcheck\": EOF" Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.524332 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.130:9292/healthcheck\": EOF" Feb 17 18:09:14 crc kubenswrapper[4762]: I0217 18:09:14.528750 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.130:9292/healthcheck\": EOF" Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.516348 4762 generic.go:334] "Generic (PLEG): container finished" podID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerID="19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d" exitCode=143 Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.516421 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"647ea438-8124-4d89-8186-4a97f5fcc48f","Type":"ContainerDied","Data":"19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d"} Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.518932 4762 generic.go:334] "Generic (PLEG): container finished" podID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerID="86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae" exitCode=143 Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.518983 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c","Type":"ContainerDied","Data":"86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae"} Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.521680 4762 generic.go:334] "Generic (PLEG): container finished" podID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerID="075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7" exitCode=143 Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.521713 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"1c7d89a9-6206-4ffe-bd91-a407c2066f23","Type":"ContainerDied","Data":"075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7"} Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.521934 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-log" containerID="cri-o://cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a" gracePeriod=30 Feb 17 18:09:15 crc kubenswrapper[4762]: I0217 18:09:15.522001 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-httpd" containerID="cri-o://4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37" gracePeriod=30 Feb 17 18:09:16 crc kubenswrapper[4762]: I0217 18:09:16.536889 4762 generic.go:334] "Generic (PLEG): container finished" podID="e1efdbe3-5844-4335-959d-378864298dc1" containerID="cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a" exitCode=143 Feb 17 18:09:16 crc kubenswrapper[4762]: I0217 18:09:16.536976 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" 
event={"ID":"e1efdbe3-5844-4335-959d-378864298dc1","Type":"ContainerDied","Data":"cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a"} Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.080133 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148425 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4q5v\" (UniqueName: \"kubernetes.io/projected/1c7d89a9-6206-4ffe-bd91-a407c2066f23-kube-api-access-r4q5v\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148508 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-iscsi\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148529 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-dev\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148551 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-var-locks-brick\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148660 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-logs\") pod 
\"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148682 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-run\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148694 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148714 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-lib-modules\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148761 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148835 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-config-data\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148778 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-run" (OuterVolumeSpecName: "run") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148797 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148796 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-dev" (OuterVolumeSpecName: "dev") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148876 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-scripts\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148902 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148926 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-sys\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148955 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-nvme\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.148974 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.149000 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-httpd-run\") pod 
\"1c7d89a9-6206-4ffe-bd91-a407c2066f23\" (UID: \"1c7d89a9-6206-4ffe-bd91-a407c2066f23\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.149084 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-logs" (OuterVolumeSpecName: "logs") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.149142 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-sys" (OuterVolumeSpecName: "sys") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.149314 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.149672 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151291 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151314 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151325 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151334 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151365 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151375 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151385 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7d89a9-6206-4ffe-bd91-a407c2066f23-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151394 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.151403 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7d89a9-6206-4ffe-bd91-a407c2066f23-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.153508 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.158877 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.159065 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7d89a9-6206-4ffe-bd91-a407c2066f23-kube-api-access-r4q5v" (OuterVolumeSpecName: "kube-api-access-r4q5v") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "kube-api-access-r4q5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.161790 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-scripts" (OuterVolumeSpecName: "scripts") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.187038 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-config-data" (OuterVolumeSpecName: "config-data") pod "1c7d89a9-6206-4ffe-bd91-a407c2066f23" (UID: "1c7d89a9-6206-4ffe-bd91-a407c2066f23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.207722 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.253289 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.253640 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c7d89a9-6206-4ffe-bd91-a407c2066f23-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.253674 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.253691 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.253702 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4q5v\" (UniqueName: \"kubernetes.io/projected/1c7d89a9-6206-4ffe-bd91-a407c2066f23-kube-api-access-r4q5v\") on node \"crc\" 
DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.265810 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.269219 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.355583 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.355979 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-lib-modules\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356012 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-run\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356053 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-sys\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356079 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-iscsi\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356098 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356142 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-scripts\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356168 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-config-data\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356166 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356216 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-run" (OuterVolumeSpecName: "run") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356193 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-httpd-run\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356376 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-logs\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356415 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-var-locks-brick\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356478 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-nvme\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356510 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-dev\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356535 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndng7\" (UniqueName: 
\"kubernetes.io/projected/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-kube-api-access-ndng7\") pod \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\" (UID: \"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c\") " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356179 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-sys" (OuterVolumeSpecName: "sys") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356528 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356555 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356828 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-logs" (OuterVolumeSpecName: "logs") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356856 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.356879 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-dev" (OuterVolumeSpecName: "dev") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357180 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357196 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357206 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357214 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357222 4762 
reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357231 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357239 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357249 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357257 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.357265 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.358911 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.359444 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.359466 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.359482 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-scripts" (OuterVolumeSpecName: "scripts") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.360555 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-kube-api-access-ndng7" (OuterVolumeSpecName: "kube-api-access-ndng7") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "kube-api-access-ndng7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.392296 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-config-data" (OuterVolumeSpecName: "config-data") pod "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" (UID: "116a936d-e7fd-4c06-b3a0-1b56d0f7f30c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.459322 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.459367 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.459383 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndng7\" (UniqueName: \"kubernetes.io/projected/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-kube-api-access-ndng7\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.459420 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.459434 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.459452 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.478562 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.479261 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.555539 4762 generic.go:334] "Generic (PLEG): container finished" podID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerID="42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce" exitCode=0 Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.555617 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.555710 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"1c7d89a9-6206-4ffe-bd91-a407c2066f23","Type":"ContainerDied","Data":"42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce"} Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.555774 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"1c7d89a9-6206-4ffe-bd91-a407c2066f23","Type":"ContainerDied","Data":"fcf1be0f3832a852b1be6b3719102880be056aa420cde124257f9bb7b49c1bd7"} Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.555799 4762 scope.go:117] "RemoveContainer" containerID="42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.557555 4762 generic.go:334] "Generic (PLEG): container finished" podID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" 
containerID="52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe" exitCode=0 Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.557600 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c","Type":"ContainerDied","Data":"52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe"} Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.557637 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.557640 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"116a936d-e7fd-4c06-b3a0-1b56d0f7f30c","Type":"ContainerDied","Data":"5dca558fff76c657eff8c627b17387aa664cc3f339960e8a8058232ab0d2fbcc"} Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.561251 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.561483 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.580854 4762 scope.go:117] "RemoveContainer" containerID="075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.606957 4762 scope.go:117] "RemoveContainer" containerID="42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.609916 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:09:18 crc kubenswrapper[4762]: 
E0217 18:09:18.611104 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce\": container with ID starting with 42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce not found: ID does not exist" containerID="42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.611159 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce"} err="failed to get container status \"42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce\": rpc error: code = NotFound desc = could not find container \"42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce\": container with ID starting with 42946b2edcffab970c7df627d3b6f2ac8fafe365887d368ca4f130e644864dce not found: ID does not exist" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.611186 4762 scope.go:117] "RemoveContainer" containerID="075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7" Feb 17 18:09:18 crc kubenswrapper[4762]: E0217 18:09:18.611485 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7\": container with ID starting with 075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7 not found: ID does not exist" containerID="075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.611522 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7"} err="failed to get container status \"075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7\": 
rpc error: code = NotFound desc = could not find container \"075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7\": container with ID starting with 075659718f627426d54bb3ac7a55d1dc7e32cf8c90d02eb8c39becddb03566b7 not found: ID does not exist" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.611542 4762 scope.go:117] "RemoveContainer" containerID="52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.651173 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.651409 4762 scope.go:117] "RemoveContainer" containerID="86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.656307 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.665815 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.725390 4762 scope.go:117] "RemoveContainer" containerID="52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe" Feb 17 18:09:18 crc kubenswrapper[4762]: E0217 18:09:18.725843 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe\": container with ID starting with 52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe not found: ID does not exist" containerID="52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.725883 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe"} err="failed to get container status \"52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe\": rpc error: code = NotFound desc = could not find container \"52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe\": container with ID starting with 52d8ae51ee1350accba1b410c32d893db916a21ec72416d5a0e02df34a291dfe not found: ID does not exist" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.725913 4762 scope.go:117] "RemoveContainer" containerID="86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae" Feb 17 18:09:18 crc kubenswrapper[4762]: E0217 18:09:18.726173 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae\": container with ID starting with 86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae not found: ID does not exist" containerID="86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae" Feb 17 18:09:18 crc kubenswrapper[4762]: I0217 18:09:18.726235 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae"} err="failed to get container status \"86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae\": rpc error: code = NotFound desc = could not find container \"86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae\": container with ID starting with 86d726df920f935d4f20958796c0f268b1b3571cb904bed6da426c4cb36340ae not found: ID does not exist" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.046379 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" path="/var/lib/kubelet/pods/116a936d-e7fd-4c06-b3a0-1b56d0f7f30c/volumes" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 
18:09:19.047027 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" path="/var/lib/kubelet/pods/1c7d89a9-6206-4ffe-bd91-a407c2066f23/volumes" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.117865 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276105 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-run\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276392 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276466 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-dev\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276495 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-logs\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276485 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-run" (OuterVolumeSpecName: "run") pod 
"e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276518 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276631 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-sys\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276665 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-var-locks-brick\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276796 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-httpd-run\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276847 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm7cq\" (UniqueName: \"kubernetes.io/projected/e1efdbe3-5844-4335-959d-378864298dc1-kube-api-access-hm7cq\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276878 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-config-data\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276876 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-dev" (OuterVolumeSpecName: "dev") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276926 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-scripts\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.276992 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-nvme\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277070 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-iscsi\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: \"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277101 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-lib-modules\") pod \"e1efdbe3-5844-4335-959d-378864298dc1\" (UID: 
\"e1efdbe3-5844-4335-959d-378864298dc1\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277222 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277270 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-sys" (OuterVolumeSpecName: "sys") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277298 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-logs" (OuterVolumeSpecName: "logs") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277841 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277884 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277892 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277915 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277917 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277926 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277938 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1efdbe3-5844-4335-959d-378864298dc1-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277970 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.277983 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.280222 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.280641 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.281084 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1efdbe3-5844-4335-959d-378864298dc1-kube-api-access-hm7cq" (OuterVolumeSpecName: "kube-api-access-hm7cq") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "kube-api-access-hm7cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.282315 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-scripts" (OuterVolumeSpecName: "scripts") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.301134 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.317944 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-config-data" (OuterVolumeSpecName: "config-data") pod "e1efdbe3-5844-4335-959d-378864298dc1" (UID: "e1efdbe3-5844-4335-959d-378864298dc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378532 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9rdh\" (UniqueName: \"kubernetes.io/projected/647ea438-8124-4d89-8186-4a97f5fcc48f-kube-api-access-s9rdh\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378597 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-logs\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378649 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-lib-modules\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378671 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-iscsi\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378716 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-var-locks-brick\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378783 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-httpd-run\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378777 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-dev\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378850 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378871 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378869 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378874 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378926 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-sys" (OuterVolumeSpecName: "sys") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378898 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-sys\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378950 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-dev" (OuterVolumeSpecName: "dev") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378968 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-config-data\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.378992 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-scripts\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379009 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-run\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379014 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379029 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-nvme\") pod \"647ea438-8124-4d89-8186-4a97f5fcc48f\" (UID: \"647ea438-8124-4d89-8186-4a97f5fcc48f\") " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379048 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-logs" (OuterVolumeSpecName: "logs") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379251 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-run" (OuterVolumeSpecName: "run") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379307 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379514 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379528 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379542 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379549 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379579 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379587 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379596 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379604 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379633 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379642 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379654 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm7cq\" (UniqueName: \"kubernetes.io/projected/e1efdbe3-5844-4335-959d-378864298dc1-kube-api-access-hm7cq\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379663 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379671 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/647ea438-8124-4d89-8186-4a97f5fcc48f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379679 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1efdbe3-5844-4335-959d-378864298dc1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379686 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 
18:09:19.379695 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/647ea438-8124-4d89-8186-4a97f5fcc48f-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.379702 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e1efdbe3-5844-4335-959d-378864298dc1-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.382917 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance-cache") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.383585 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-scripts" (OuterVolumeSpecName: "scripts") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.383814 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647ea438-8124-4d89-8186-4a97f5fcc48f-kube-api-access-s9rdh" (OuterVolumeSpecName: "kube-api-access-s9rdh") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "kube-api-access-s9rdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.385818 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.394470 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.394813 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.421693 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-config-data" (OuterVolumeSpecName: "config-data") pod "647ea438-8124-4d89-8186-4a97f5fcc48f" (UID: "647ea438-8124-4d89-8186-4a97f5fcc48f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481249 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481296 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481308 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481318 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481327 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/647ea438-8124-4d89-8186-4a97f5fcc48f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481337 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.481348 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9rdh\" (UniqueName: \"kubernetes.io/projected/647ea438-8124-4d89-8186-4a97f5fcc48f-kube-api-access-s9rdh\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.494020 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" 
(UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.494474 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.566842 4762 generic.go:334] "Generic (PLEG): container finished" podID="e1efdbe3-5844-4335-959d-378864298dc1" containerID="4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37" exitCode=0 Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.566906 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e1efdbe3-5844-4335-959d-378864298dc1","Type":"ContainerDied","Data":"4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37"} Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.566932 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e1efdbe3-5844-4335-959d-378864298dc1","Type":"ContainerDied","Data":"216bd0ebdecf38b4ceeb699a485c3ec4ea153cfed3aec9b8dfddaf94da103cb1"} Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.566948 4762 scope.go:117] "RemoveContainer" containerID="4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.567035 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.575983 4762 generic.go:334] "Generic (PLEG): container finished" podID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerID="7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a" exitCode=0 Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.576028 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.576034 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"647ea438-8124-4d89-8186-4a97f5fcc48f","Type":"ContainerDied","Data":"7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a"} Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.576061 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"647ea438-8124-4d89-8186-4a97f5fcc48f","Type":"ContainerDied","Data":"c4050882ded574a05efa46436e6bac4dc451279895662cad98af26125eb96827"} Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.582566 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.582598 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.603986 4762 scope.go:117] "RemoveContainer" containerID="cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.612153 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.622683 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.628586 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.629340 4762 scope.go:117] 
"RemoveContainer" containerID="4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37" Feb 17 18:09:19 crc kubenswrapper[4762]: E0217 18:09:19.629871 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37\": container with ID starting with 4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37 not found: ID does not exist" containerID="4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.629953 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37"} err="failed to get container status \"4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37\": rpc error: code = NotFound desc = could not find container \"4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37\": container with ID starting with 4def10ea07e1ade3f062b414022267c55ff1fb70256aeb058713cbf348536f37 not found: ID does not exist" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.629976 4762 scope.go:117] "RemoveContainer" containerID="cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a" Feb 17 18:09:19 crc kubenswrapper[4762]: E0217 18:09:19.630284 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a\": container with ID starting with cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a not found: ID does not exist" containerID="cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.630341 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a"} err="failed to get container status \"cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a\": rpc error: code = NotFound desc = could not find container \"cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a\": container with ID starting with cc3ebfa6fbdc86084b0628ed71d650d023244a614bf628658949d1c34d0bba4a not found: ID does not exist" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.630370 4762 scope.go:117] "RemoveContainer" containerID="7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.634358 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.656577 4762 scope.go:117] "RemoveContainer" containerID="19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.673964 4762 scope.go:117] "RemoveContainer" containerID="7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a" Feb 17 18:09:19 crc kubenswrapper[4762]: E0217 18:09:19.674482 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a\": container with ID starting with 7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a not found: ID does not exist" containerID="7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.674514 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a"} err="failed to get container status \"7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a\": rpc error: code = 
NotFound desc = could not find container \"7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a\": container with ID starting with 7576e4ecb87229a9968eeaff70550e82a42a7bf1b84b444f10a40edeba7b1c7a not found: ID does not exist" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.674553 4762 scope.go:117] "RemoveContainer" containerID="19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d" Feb 17 18:09:19 crc kubenswrapper[4762]: E0217 18:09:19.674829 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d\": container with ID starting with 19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d not found: ID does not exist" containerID="19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d" Feb 17 18:09:19 crc kubenswrapper[4762]: I0217 18:09:19.674853 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d"} err="failed to get container status \"19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d\": rpc error: code = NotFound desc = could not find container \"19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d\": container with ID starting with 19dda477ebb7673cc8b722fc10b623bb012f00fc6716d9efa3ccf998bf801d5d not found: ID does not exist" Feb 17 18:09:20 crc kubenswrapper[4762]: I0217 18:09:20.650902 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:09:20 crc kubenswrapper[4762]: I0217 18:09:20.651668 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-log" 
containerID="cri-o://a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f" gracePeriod=30 Feb 17 18:09:20 crc kubenswrapper[4762]: I0217 18:09:20.651881 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-httpd" containerID="cri-o://0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9" gracePeriod=30 Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.055431 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" path="/var/lib/kubelet/pods/647ea438-8124-4d89-8186-4a97f5fcc48f/volumes" Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.057752 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1efdbe3-5844-4335-959d-378864298dc1" path="/var/lib/kubelet/pods/e1efdbe3-5844-4335-959d-378864298dc1/volumes" Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.165362 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.165872 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-log" containerID="cri-o://5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401" gracePeriod=30 Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.165982 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-httpd" containerID="cri-o://6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed" gracePeriod=30 Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.605913 4762 generic.go:334] "Generic (PLEG): container 
finished" podID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerID="a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f" exitCode=143 Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.605999 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf","Type":"ContainerDied","Data":"a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f"} Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.608203 4762 generic.go:334] "Generic (PLEG): container finished" podID="b8272a02-2571-40e7-9f82-8528eb93918c" containerID="5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401" exitCode=143 Feb 17 18:09:21 crc kubenswrapper[4762]: I0217 18:09:21.608256 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b8272a02-2571-40e7-9f82-8528eb93918c","Type":"ContainerDied","Data":"5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401"} Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.044040 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156502 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-var-locks-brick\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156594 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-config-data\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156619 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-lib-modules\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156642 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156679 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-scripts\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156716 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156757 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-logs\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156766 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156837 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-iscsi\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156873 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-sys\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156894 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-dev\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156926 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-httpd-run\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.156952 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-run\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157273 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157332 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4glz9\" (UniqueName: \"kubernetes.io/projected/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-kube-api-access-4glz9\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157097 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-logs" (OuterVolumeSpecName: "logs") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157356 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-nvme\") pod \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\" (UID: \"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157359 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157121 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-sys" (OuterVolumeSpecName: "sys") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157138 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157159 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-dev" (OuterVolumeSpecName: "dev") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157182 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-run" (OuterVolumeSpecName: "run") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157526 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157968 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157983 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.157995 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.158003 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.158011 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.158019 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.158028 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.158036 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.158044 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.163060 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.163127 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.163147 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-kube-api-access-4glz9" (OuterVolumeSpecName: "kube-api-access-4glz9") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "kube-api-access-4glz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.164025 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-scripts" (OuterVolumeSpecName: "scripts") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.187027 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-config-data" (OuterVolumeSpecName: "config-data") pod "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" (UID: "ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.259961 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.260006 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.260053 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.260107 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.260124 4762 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4glz9\" (UniqueName: \"kubernetes.io/projected/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf-kube-api-access-4glz9\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.275669 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.281272 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.361936 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.361990 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.597387 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.643108 4762 generic.go:334] "Generic (PLEG): container finished" podID="b8272a02-2571-40e7-9f82-8528eb93918c" containerID="6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed" exitCode=0 Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.643154 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b8272a02-2571-40e7-9f82-8528eb93918c","Type":"ContainerDied","Data":"6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed"} Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.643204 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b8272a02-2571-40e7-9f82-8528eb93918c","Type":"ContainerDied","Data":"885ca62993dfe0a19a6385ed53cf93f8667c491f6ee2f320d502a15e0e10d4b2"} Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.643229 4762 scope.go:117] "RemoveContainer" containerID="6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.643195 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.648180 4762 generic.go:334] "Generic (PLEG): container finished" podID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerID="0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9" exitCode=0 Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.648224 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf","Type":"ContainerDied","Data":"0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9"} Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.648256 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf","Type":"ContainerDied","Data":"55966b6e0c9ca648e148698cf8964273fc0ff72d9da178757a6d998a6cb1b449"} Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.648323 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.672147 4762 scope.go:117] "RemoveContainer" containerID="5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.684305 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.694670 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.704349 4762 scope.go:117] "RemoveContainer" containerID="6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed" Feb 17 18:09:24 crc kubenswrapper[4762]: E0217 18:09:24.704910 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed\": container with ID starting with 6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed not found: ID does not exist" containerID="6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.704956 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed"} err="failed to get container status \"6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed\": rpc error: code = NotFound desc = could not find container \"6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed\": container with ID starting with 6f1d8d03e319341593921bf05b885c11b51b4c6f6a16ef24cdc34d5356d96aed not found: ID does not exist" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.704983 4762 scope.go:117] "RemoveContainer" 
containerID="5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401" Feb 17 18:09:24 crc kubenswrapper[4762]: E0217 18:09:24.705451 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401\": container with ID starting with 5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401 not found: ID does not exist" containerID="5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.705499 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401"} err="failed to get container status \"5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401\": rpc error: code = NotFound desc = could not find container \"5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401\": container with ID starting with 5306cfb6e299a0cec558a2d69aad958be63f1aa871e72677d67c1ad9d9855401 not found: ID does not exist" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.705527 4762 scope.go:117] "RemoveContainer" containerID="0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.728798 4762 scope.go:117] "RemoveContainer" containerID="a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.744690 4762 scope.go:117] "RemoveContainer" containerID="0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9" Feb 17 18:09:24 crc kubenswrapper[4762]: E0217 18:09:24.745158 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9\": container with ID starting with 
0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9 not found: ID does not exist" containerID="0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.745200 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9"} err="failed to get container status \"0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9\": rpc error: code = NotFound desc = could not find container \"0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9\": container with ID starting with 0070d302f78c4c25d6e665305952f9301f39490029b9e12289f5f9f019f913e9 not found: ID does not exist" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.745231 4762 scope.go:117] "RemoveContainer" containerID="a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f" Feb 17 18:09:24 crc kubenswrapper[4762]: E0217 18:09:24.745682 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f\": container with ID starting with a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f not found: ID does not exist" containerID="a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.745714 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f"} err="failed to get container status \"a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f\": rpc error: code = NotFound desc = could not find container \"a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f\": container with ID starting with a3f499cb46fe51c8313f908add9ba983a12f96d93ddb1d1d7ba61991a0b4449f not found: ID does not 
exist" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766330 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-httpd-run\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766435 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-sys\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766470 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-dev\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766517 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-var-locks-brick\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766562 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-nvme\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766571 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-sys" (OuterVolumeSpecName: "sys") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: 
"b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766589 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-scripts\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766669 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-run\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-logs\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766748 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-iscsi\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766772 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-lib-modules\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766822 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbt7\" 
(UniqueName: \"kubernetes.io/projected/b8272a02-2571-40e7-9f82-8528eb93918c-kube-api-access-xfbt7\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766846 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-config-data\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766837 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766883 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766899 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"b8272a02-2571-40e7-9f82-8528eb93918c\" (UID: \"b8272a02-2571-40e7-9f82-8528eb93918c\") " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766905 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.766931 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-run" (OuterVolumeSpecName: "run") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767113 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767151 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-dev" (OuterVolumeSpecName: "dev") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767193 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-logs" (OuterVolumeSpecName: "logs") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767434 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767434 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767585 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767812 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767896 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.767979 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.768040 
4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.768098 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8272a02-2571-40e7-9f82-8528eb93918c-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.768152 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.770023 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-scripts" (OuterVolumeSpecName: "scripts") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.770412 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.771044 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.771059 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8272a02-2571-40e7-9f82-8528eb93918c-kube-api-access-xfbt7" (OuterVolumeSpecName: "kube-api-access-xfbt7") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "kube-api-access-xfbt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.803423 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-config-data" (OuterVolumeSpecName: "config-data") pod "b8272a02-2571-40e7-9f82-8528eb93918c" (UID: "b8272a02-2571-40e7-9f82-8528eb93918c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869886 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869919 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbt7\" (UniqueName: \"kubernetes.io/projected/b8272a02-2571-40e7-9f82-8528eb93918c-kube-api-access-xfbt7\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869930 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869958 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node 
\"crc\" " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869975 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869988 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b8272a02-2571-40e7-9f82-8528eb93918c-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.869998 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8272a02-2571-40e7-9f82-8528eb93918c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.887817 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.890456 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.973783 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.973831 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.977818 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:09:24 crc kubenswrapper[4762]: I0217 18:09:24.984553 4762 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.047638 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" path="/var/lib/kubelet/pods/b8272a02-2571-40e7-9f82-8528eb93918c/volumes" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.048445 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" path="/var/lib/kubelet/pods/ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf/volumes" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.607275 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-c4hz2"] Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.618164 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-c4hz2"] Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658388 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glanceaee5-account-delete-n54jl"] Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658745 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658760 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658776 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658782 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658789 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658795 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658804 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658810 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658819 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658824 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658835 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658840 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658851 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658857 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658867 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" 
containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658874 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658885 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658892 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658909 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658916 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658928 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658934 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: E0217 18:09:25.658944 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.658951 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659066 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-log" Feb 17 18:09:25 crc 
kubenswrapper[4762]: I0217 18:09:25.659079 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3cd3d6-d64b-4fa7-98ad-7b0b45edcacf" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659088 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659097 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659108 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659114 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659121 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659128 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="647ea438-8124-4d89-8186-4a97f5fcc48f" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659135 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="116a936d-e7fd-4c06-b3a0-1b56d0f7f30c" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659142 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7d89a9-6206-4ffe-bd91-a407c2066f23" containerName="glance-httpd" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659151 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1efdbe3-5844-4335-959d-378864298dc1" containerName="glance-httpd" Feb 17 18:09:25 crc 
kubenswrapper[4762]: I0217 18:09:25.659161 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8272a02-2571-40e7-9f82-8528eb93918c" containerName="glance-log" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.659600 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.665703 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanceaee5-account-delete-n54jl"] Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.785947 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp48f\" (UniqueName: \"kubernetes.io/projected/45c6b782-5d89-4555-add8-1b54c8d76565-kube-api-access-bp48f\") pod \"glanceaee5-account-delete-n54jl\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.786142 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6b782-5d89-4555-add8-1b54c8d76565-operator-scripts\") pod \"glanceaee5-account-delete-n54jl\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.886929 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp48f\" (UniqueName: \"kubernetes.io/projected/45c6b782-5d89-4555-add8-1b54c8d76565-kube-api-access-bp48f\") pod \"glanceaee5-account-delete-n54jl\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.887055 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/45c6b782-5d89-4555-add8-1b54c8d76565-operator-scripts\") pod \"glanceaee5-account-delete-n54jl\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.887891 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6b782-5d89-4555-add8-1b54c8d76565-operator-scripts\") pod \"glanceaee5-account-delete-n54jl\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.903193 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp48f\" (UniqueName: \"kubernetes.io/projected/45c6b782-5d89-4555-add8-1b54c8d76565-kube-api-access-bp48f\") pod \"glanceaee5-account-delete-n54jl\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:25 crc kubenswrapper[4762]: I0217 18:09:25.980500 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:26 crc kubenswrapper[4762]: I0217 18:09:26.404704 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanceaee5-account-delete-n54jl"] Feb 17 18:09:26 crc kubenswrapper[4762]: I0217 18:09:26.669553 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" event={"ID":"45c6b782-5d89-4555-add8-1b54c8d76565","Type":"ContainerStarted","Data":"7563462fc17d75735df5ca31ed1b7a309d849aa9bf9199ad2271cff1a5460924"} Feb 17 18:09:26 crc kubenswrapper[4762]: I0217 18:09:26.669598 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" event={"ID":"45c6b782-5d89-4555-add8-1b54c8d76565","Type":"ContainerStarted","Data":"5f4138dac09dc0a4e2a049c2c5620cf8278f52949c1d29062226b2fed7098a91"} Feb 17 18:09:26 crc kubenswrapper[4762]: I0217 18:09:26.683321 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" podStartSLOduration=1.6833025350000002 podStartE2EDuration="1.683302535s" podCreationTimestamp="2026-02-17 18:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:26.682814831 +0000 UTC m=+1318.327732841" watchObservedRunningTime="2026-02-17 18:09:26.683302535 +0000 UTC m=+1318.328220545" Feb 17 18:09:26 crc kubenswrapper[4762]: E0217 18:09:26.846905 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c6b782_5d89_4555_add8_1b54c8d76565.slice/crio-conmon-7563462fc17d75735df5ca31ed1b7a309d849aa9bf9199ad2271cff1a5460924.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c6b782_5d89_4555_add8_1b54c8d76565.slice/crio-7563462fc17d75735df5ca31ed1b7a309d849aa9bf9199ad2271cff1a5460924.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:09:27 crc kubenswrapper[4762]: I0217 18:09:27.044992 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45cef97c-2248-4f8d-9e35-0f8d9db0e6be" path="/var/lib/kubelet/pods/45cef97c-2248-4f8d-9e35-0f8d9db0e6be/volumes" Feb 17 18:09:27 crc kubenswrapper[4762]: I0217 18:09:27.678373 4762 generic.go:334] "Generic (PLEG): container finished" podID="45c6b782-5d89-4555-add8-1b54c8d76565" containerID="7563462fc17d75735df5ca31ed1b7a309d849aa9bf9199ad2271cff1a5460924" exitCode=0 Feb 17 18:09:27 crc kubenswrapper[4762]: I0217 18:09:27.678438 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" event={"ID":"45c6b782-5d89-4555-add8-1b54c8d76565","Type":"ContainerDied","Data":"7563462fc17d75735df5ca31ed1b7a309d849aa9bf9199ad2271cff1a5460924"} Feb 17 18:09:28 crc kubenswrapper[4762]: I0217 18:09:28.969820 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.135997 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp48f\" (UniqueName: \"kubernetes.io/projected/45c6b782-5d89-4555-add8-1b54c8d76565-kube-api-access-bp48f\") pod \"45c6b782-5d89-4555-add8-1b54c8d76565\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.136158 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6b782-5d89-4555-add8-1b54c8d76565-operator-scripts\") pod \"45c6b782-5d89-4555-add8-1b54c8d76565\" (UID: \"45c6b782-5d89-4555-add8-1b54c8d76565\") " Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.136661 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c6b782-5d89-4555-add8-1b54c8d76565-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45c6b782-5d89-4555-add8-1b54c8d76565" (UID: "45c6b782-5d89-4555-add8-1b54c8d76565"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.150538 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c6b782-5d89-4555-add8-1b54c8d76565-kube-api-access-bp48f" (OuterVolumeSpecName: "kube-api-access-bp48f") pod "45c6b782-5d89-4555-add8-1b54c8d76565" (UID: "45c6b782-5d89-4555-add8-1b54c8d76565"). InnerVolumeSpecName "kube-api-access-bp48f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.237691 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c6b782-5d89-4555-add8-1b54c8d76565-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.237727 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp48f\" (UniqueName: \"kubernetes.io/projected/45c6b782-5d89-4555-add8-1b54c8d76565-kube-api-access-bp48f\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.696556 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" event={"ID":"45c6b782-5d89-4555-add8-1b54c8d76565","Type":"ContainerDied","Data":"5f4138dac09dc0a4e2a049c2c5620cf8278f52949c1d29062226b2fed7098a91"} Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.696603 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f4138dac09dc0a4e2a049c2c5620cf8278f52949c1d29062226b2fed7098a91" Feb 17 18:09:29 crc kubenswrapper[4762]: I0217 18:09:29.696606 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanceaee5-account-delete-n54jl" Feb 17 18:09:30 crc kubenswrapper[4762]: I0217 18:09:30.683475 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-xxctl"] Feb 17 18:09:30 crc kubenswrapper[4762]: I0217 18:09:30.694130 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-xxctl"] Feb 17 18:09:30 crc kubenswrapper[4762]: I0217 18:09:30.727369 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glanceaee5-account-delete-n54jl"] Feb 17 18:09:30 crc kubenswrapper[4762]: I0217 18:09:30.744890 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glanceaee5-account-delete-n54jl"] Feb 17 18:09:30 crc kubenswrapper[4762]: I0217 18:09:30.750511 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-aee5-account-create-update-8x8tp"] Feb 17 18:09:30 crc kubenswrapper[4762]: I0217 18:09:30.755491 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-aee5-account-create-update-8x8tp"] Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.033342 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-xpjjp"] Feb 17 18:09:31 crc kubenswrapper[4762]: E0217 18:09:31.033890 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c6b782-5d89-4555-add8-1b54c8d76565" containerName="mariadb-account-delete" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.033921 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c6b782-5d89-4555-add8-1b54c8d76565" containerName="mariadb-account-delete" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.034146 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c6b782-5d89-4555-add8-1b54c8d76565" containerName="mariadb-account-delete" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.035073 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.045799 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359b218a-4867-47c2-ab60-f717d3105e86" path="/var/lib/kubelet/pods/359b218a-4867-47c2-ab60-f717d3105e86/volumes" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.046929 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3736be6a-0bad-4095-bed7-301ba6790a21" path="/var/lib/kubelet/pods/3736be6a-0bad-4095-bed7-301ba6790a21/volumes" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.047559 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c6b782-5d89-4555-add8-1b54c8d76565" path="/var/lib/kubelet/pods/45c6b782-5d89-4555-add8-1b54c8d76565/volumes" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.048199 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-8c99-account-create-update-8qf6v"] Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.049259 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-xpjjp"] Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.049358 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.053058 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.057011 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-8c99-account-create-update-8qf6v"] Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.167806 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084491d6-9b5a-4a3c-b8b5-c887667fa167-operator-scripts\") pod \"glance-8c99-account-create-update-8qf6v\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.168161 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zfsx\" (UniqueName: \"kubernetes.io/projected/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-kube-api-access-9zfsx\") pod \"glance-db-create-xpjjp\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.168244 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-operator-scripts\") pod \"glance-db-create-xpjjp\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.168405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2rr\" (UniqueName: 
\"kubernetes.io/projected/084491d6-9b5a-4a3c-b8b5-c887667fa167-kube-api-access-jd2rr\") pod \"glance-8c99-account-create-update-8qf6v\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.269920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084491d6-9b5a-4a3c-b8b5-c887667fa167-operator-scripts\") pod \"glance-8c99-account-create-update-8qf6v\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.270043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zfsx\" (UniqueName: \"kubernetes.io/projected/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-kube-api-access-9zfsx\") pod \"glance-db-create-xpjjp\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.270080 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-operator-scripts\") pod \"glance-db-create-xpjjp\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.270106 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2rr\" (UniqueName: \"kubernetes.io/projected/084491d6-9b5a-4a3c-b8b5-c887667fa167-kube-api-access-jd2rr\") pod \"glance-8c99-account-create-update-8qf6v\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.271146 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-operator-scripts\") pod \"glance-db-create-xpjjp\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.271368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084491d6-9b5a-4a3c-b8b5-c887667fa167-operator-scripts\") pod \"glance-8c99-account-create-update-8qf6v\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.296239 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zfsx\" (UniqueName: \"kubernetes.io/projected/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-kube-api-access-9zfsx\") pod \"glance-db-create-xpjjp\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.296276 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2rr\" (UniqueName: \"kubernetes.io/projected/084491d6-9b5a-4a3c-b8b5-c887667fa167-kube-api-access-jd2rr\") pod \"glance-8c99-account-create-update-8qf6v\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.354883 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.376942 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.780974 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-xpjjp"] Feb 17 18:09:31 crc kubenswrapper[4762]: I0217 18:09:31.832995 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-8c99-account-create-update-8qf6v"] Feb 17 18:09:32 crc kubenswrapper[4762]: I0217 18:09:32.724803 4762 generic.go:334] "Generic (PLEG): container finished" podID="cd6e1e23-e91b-4d8d-a520-4d599a2d1538" containerID="e1fd2c667d472a338cbffe4a797dca090eb418b87a6e0cc58b0885d1b8252c4f" exitCode=0 Feb 17 18:09:32 crc kubenswrapper[4762]: I0217 18:09:32.724933 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-xpjjp" event={"ID":"cd6e1e23-e91b-4d8d-a520-4d599a2d1538","Type":"ContainerDied","Data":"e1fd2c667d472a338cbffe4a797dca090eb418b87a6e0cc58b0885d1b8252c4f"} Feb 17 18:09:32 crc kubenswrapper[4762]: I0217 18:09:32.725193 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-xpjjp" event={"ID":"cd6e1e23-e91b-4d8d-a520-4d599a2d1538","Type":"ContainerStarted","Data":"ddd2b3f5daddc447f6218a7b34a19d320730e8dbc256a8c52040fa4f4038ceb8"} Feb 17 18:09:32 crc kubenswrapper[4762]: I0217 18:09:32.727273 4762 generic.go:334] "Generic (PLEG): container finished" podID="084491d6-9b5a-4a3c-b8b5-c887667fa167" containerID="737c3bd012c747cbfd3b4daa6fe52ee9322e7cc31d64b2e47335f49d62877219" exitCode=0 Feb 17 18:09:32 crc kubenswrapper[4762]: I0217 18:09:32.727322 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" event={"ID":"084491d6-9b5a-4a3c-b8b5-c887667fa167","Type":"ContainerDied","Data":"737c3bd012c747cbfd3b4daa6fe52ee9322e7cc31d64b2e47335f49d62877219"} Feb 17 18:09:32 crc kubenswrapper[4762]: I0217 18:09:32.727353 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" event={"ID":"084491d6-9b5a-4a3c-b8b5-c887667fa167","Type":"ContainerStarted","Data":"b7fc8a66b2e6f29e07a37ed280bd02471deec91240d7bb505184d27f6302d6c1"} Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.107907 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.115440 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.228373 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd2rr\" (UniqueName: \"kubernetes.io/projected/084491d6-9b5a-4a3c-b8b5-c887667fa167-kube-api-access-jd2rr\") pod \"084491d6-9b5a-4a3c-b8b5-c887667fa167\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.228601 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-operator-scripts\") pod \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.228833 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084491d6-9b5a-4a3c-b8b5-c887667fa167-operator-scripts\") pod \"084491d6-9b5a-4a3c-b8b5-c887667fa167\" (UID: \"084491d6-9b5a-4a3c-b8b5-c887667fa167\") " Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.228888 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zfsx\" (UniqueName: 
\"kubernetes.io/projected/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-kube-api-access-9zfsx\") pod \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\" (UID: \"cd6e1e23-e91b-4d8d-a520-4d599a2d1538\") " Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.229410 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd6e1e23-e91b-4d8d-a520-4d599a2d1538" (UID: "cd6e1e23-e91b-4d8d-a520-4d599a2d1538"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.229520 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/084491d6-9b5a-4a3c-b8b5-c887667fa167-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "084491d6-9b5a-4a3c-b8b5-c887667fa167" (UID: "084491d6-9b5a-4a3c-b8b5-c887667fa167"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.229590 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.234138 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-kube-api-access-9zfsx" (OuterVolumeSpecName: "kube-api-access-9zfsx") pod "cd6e1e23-e91b-4d8d-a520-4d599a2d1538" (UID: "cd6e1e23-e91b-4d8d-a520-4d599a2d1538"). InnerVolumeSpecName "kube-api-access-9zfsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.234658 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084491d6-9b5a-4a3c-b8b5-c887667fa167-kube-api-access-jd2rr" (OuterVolumeSpecName: "kube-api-access-jd2rr") pod "084491d6-9b5a-4a3c-b8b5-c887667fa167" (UID: "084491d6-9b5a-4a3c-b8b5-c887667fa167"). InnerVolumeSpecName "kube-api-access-jd2rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.331066 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd2rr\" (UniqueName: \"kubernetes.io/projected/084491d6-9b5a-4a3c-b8b5-c887667fa167-kube-api-access-jd2rr\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.331117 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084491d6-9b5a-4a3c-b8b5-c887667fa167-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.331130 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zfsx\" (UniqueName: \"kubernetes.io/projected/cd6e1e23-e91b-4d8d-a520-4d599a2d1538-kube-api-access-9zfsx\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.557993 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.558083 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.752218 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.752261 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-8c99-account-create-update-8qf6v" event={"ID":"084491d6-9b5a-4a3c-b8b5-c887667fa167","Type":"ContainerDied","Data":"b7fc8a66b2e6f29e07a37ed280bd02471deec91240d7bb505184d27f6302d6c1"} Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.752317 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7fc8a66b2e6f29e07a37ed280bd02471deec91240d7bb505184d27f6302d6c1" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.754677 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-xpjjp" event={"ID":"cd6e1e23-e91b-4d8d-a520-4d599a2d1538","Type":"ContainerDied","Data":"ddd2b3f5daddc447f6218a7b34a19d320730e8dbc256a8c52040fa4f4038ceb8"} Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.754720 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd2b3f5daddc447f6218a7b34a19d320730e8dbc256a8c52040fa4f4038ceb8" Feb 17 18:09:34 crc kubenswrapper[4762]: I0217 18:09:34.754799 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-xpjjp" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.172087 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-f8lg8"] Feb 17 18:09:36 crc kubenswrapper[4762]: E0217 18:09:36.173043 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084491d6-9b5a-4a3c-b8b5-c887667fa167" containerName="mariadb-account-create-update" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.173099 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="084491d6-9b5a-4a3c-b8b5-c887667fa167" containerName="mariadb-account-create-update" Feb 17 18:09:36 crc kubenswrapper[4762]: E0217 18:09:36.173162 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6e1e23-e91b-4d8d-a520-4d599a2d1538" containerName="mariadb-database-create" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.173180 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6e1e23-e91b-4d8d-a520-4d599a2d1538" containerName="mariadb-database-create" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.173538 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="084491d6-9b5a-4a3c-b8b5-c887667fa167" containerName="mariadb-account-create-update" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.173580 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd6e1e23-e91b-4d8d-a520-4d599a2d1538" containerName="mariadb-database-create" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.174708 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.178601 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-48hqc" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.183098 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-f8lg8"] Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.185461 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.265217 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-db-sync-config-data\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.265286 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlmxt\" (UniqueName: \"kubernetes.io/projected/20676a86-92bc-497c-8bdd-54e5d7483c3c-kube-api-access-qlmxt\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.265329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-config-data\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.367141 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-db-sync-config-data\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.367221 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlmxt\" (UniqueName: \"kubernetes.io/projected/20676a86-92bc-497c-8bdd-54e5d7483c3c-kube-api-access-qlmxt\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.367277 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-config-data\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.375895 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-db-sync-config-data\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.384963 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-config-data\") pod \"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.400967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlmxt\" (UniqueName: \"kubernetes.io/projected/20676a86-92bc-497c-8bdd-54e5d7483c3c-kube-api-access-qlmxt\") pod 
\"glance-db-sync-f8lg8\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.502831 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:36 crc kubenswrapper[4762]: I0217 18:09:36.927179 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-f8lg8"] Feb 17 18:09:37 crc kubenswrapper[4762]: I0217 18:09:37.777726 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-f8lg8" event={"ID":"20676a86-92bc-497c-8bdd-54e5d7483c3c","Type":"ContainerStarted","Data":"d3d0daa38814fac1e03514e3f4f29de2591d070977bbe272db3214fe0cb28257"} Feb 17 18:09:37 crc kubenswrapper[4762]: I0217 18:09:37.778088 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-f8lg8" event={"ID":"20676a86-92bc-497c-8bdd-54e5d7483c3c","Type":"ContainerStarted","Data":"c4587f28de829bde41ceee99b656dd59822f1998216b979d180fd6bad2a469de"} Feb 17 18:09:37 crc kubenswrapper[4762]: I0217 18:09:37.793484 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-f8lg8" podStartSLOduration=1.7934696570000002 podStartE2EDuration="1.793469657s" podCreationTimestamp="2026-02-17 18:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:37.791638725 +0000 UTC m=+1329.436556745" watchObservedRunningTime="2026-02-17 18:09:37.793469657 +0000 UTC m=+1329.438387667" Feb 17 18:09:40 crc kubenswrapper[4762]: I0217 18:09:40.800142 4762 generic.go:334] "Generic (PLEG): container finished" podID="20676a86-92bc-497c-8bdd-54e5d7483c3c" containerID="d3d0daa38814fac1e03514e3f4f29de2591d070977bbe272db3214fe0cb28257" exitCode=0 Feb 17 18:09:40 crc kubenswrapper[4762]: I0217 18:09:40.800215 
4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-f8lg8" event={"ID":"20676a86-92bc-497c-8bdd-54e5d7483c3c","Type":"ContainerDied","Data":"d3d0daa38814fac1e03514e3f4f29de2591d070977bbe272db3214fe0cb28257"} Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.095335 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.255892 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlmxt\" (UniqueName: \"kubernetes.io/projected/20676a86-92bc-497c-8bdd-54e5d7483c3c-kube-api-access-qlmxt\") pod \"20676a86-92bc-497c-8bdd-54e5d7483c3c\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.256013 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-config-data\") pod \"20676a86-92bc-497c-8bdd-54e5d7483c3c\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.256058 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-db-sync-config-data\") pod \"20676a86-92bc-497c-8bdd-54e5d7483c3c\" (UID: \"20676a86-92bc-497c-8bdd-54e5d7483c3c\") " Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.261407 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "20676a86-92bc-497c-8bdd-54e5d7483c3c" (UID: "20676a86-92bc-497c-8bdd-54e5d7483c3c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.261693 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20676a86-92bc-497c-8bdd-54e5d7483c3c-kube-api-access-qlmxt" (OuterVolumeSpecName: "kube-api-access-qlmxt") pod "20676a86-92bc-497c-8bdd-54e5d7483c3c" (UID: "20676a86-92bc-497c-8bdd-54e5d7483c3c"). InnerVolumeSpecName "kube-api-access-qlmxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.299248 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-config-data" (OuterVolumeSpecName: "config-data") pod "20676a86-92bc-497c-8bdd-54e5d7483c3c" (UID: "20676a86-92bc-497c-8bdd-54e5d7483c3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.358003 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlmxt\" (UniqueName: \"kubernetes.io/projected/20676a86-92bc-497c-8bdd-54e5d7483c3c-kube-api-access-qlmxt\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.358056 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.358070 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20676a86-92bc-497c-8bdd-54e5d7483c3c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.819495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-f8lg8" 
event={"ID":"20676a86-92bc-497c-8bdd-54e5d7483c3c","Type":"ContainerDied","Data":"c4587f28de829bde41ceee99b656dd59822f1998216b979d180fd6bad2a469de"} Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.819537 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4587f28de829bde41ceee99b656dd59822f1998216b979d180fd6bad2a469de" Feb 17 18:09:42 crc kubenswrapper[4762]: I0217 18:09:42.819568 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-f8lg8" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.156696 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:43 crc kubenswrapper[4762]: E0217 18:09:43.157039 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20676a86-92bc-497c-8bdd-54e5d7483c3c" containerName="glance-db-sync" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.157052 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="20676a86-92bc-497c-8bdd-54e5d7483c3c" containerName="glance-db-sync" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.157211 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="20676a86-92bc-497c-8bdd-54e5d7483c3c" containerName="glance-db-sync" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.158007 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.160500 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.160607 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.160665 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-48hqc" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.168578 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271440 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-httpd-run\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271487 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271505 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-logs\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: 
I0217 18:09:43.271525 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271542 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-sys\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271590 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-config-data\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271615 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-dev\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271631 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-run\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271669 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-lib-modules\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271702 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271716 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271741 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-scripts\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271862 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rc2j\" (UniqueName: \"kubernetes.io/projected/a4e5fff9-85c6-46ff-95d0-b330a8872471-kube-api-access-7rc2j\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.271965 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373245 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373297 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-logs\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373325 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373346 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-sys\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373367 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-config-data\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373413 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-dev\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-run\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373452 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-sys\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373461 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-lib-modules\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373489 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373460 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-dev\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373422 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-run\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373616 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373697 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-scripts\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rc2j\" (UniqueName: \"kubernetes.io/projected/a4e5fff9-85c6-46ff-95d0-b330a8872471-kube-api-access-7rc2j\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373799 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373839 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373843 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373856 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.373973 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.374059 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-logs\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.374133 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-httpd-run\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.374580 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-httpd-run\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.380883 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-config-data\") pod 
\"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.382299 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-scripts\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.403303 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rc2j\" (UniqueName: \"kubernetes.io/projected/a4e5fff9-85c6-46ff-95d0-b330a8872471-kube-api-access-7rc2j\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.420908 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.439807 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.472380 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:43 crc kubenswrapper[4762]: I0217 18:09:43.893439 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.028039 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.837891 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a4e5fff9-85c6-46ff-95d0-b330a8872471","Type":"ContainerStarted","Data":"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f"} Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.838789 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a4e5fff9-85c6-46ff-95d0-b330a8872471","Type":"ContainerStarted","Data":"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e"} Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.838821 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a4e5fff9-85c6-46ff-95d0-b330a8872471","Type":"ContainerStarted","Data":"a771dc9ea39459572a411a88133863ac65097e757612777094fde4c933f0172a"} Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.838024 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-log" containerID="cri-o://e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e" gracePeriod=30 Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.838084 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-httpd" 
containerID="cri-o://a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f" gracePeriod=30 Feb 17 18:09:44 crc kubenswrapper[4762]: I0217 18:09:44.864018 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.8639993910000001 podStartE2EDuration="1.863999391s" podCreationTimestamp="2026-02-17 18:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:44.859845253 +0000 UTC m=+1336.504763263" watchObservedRunningTime="2026-02-17 18:09:44.863999391 +0000 UTC m=+1336.508917401" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.236141 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313515 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-config-data\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313611 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-httpd-run\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313632 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-run\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313663 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-var-locks-brick\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313701 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313733 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-sys\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313758 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-iscsi\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313774 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-lib-modules\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313791 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-logs\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 
18:09:45.313807 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-scripts\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313838 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rc2j\" (UniqueName: \"kubernetes.io/projected/a4e5fff9-85c6-46ff-95d0-b330a8872471-kube-api-access-7rc2j\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313828 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-run" (OuterVolumeSpecName: "run") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313852 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313860 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-dev\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313884 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-dev" (OuterVolumeSpecName: "dev") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313909 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-sys" (OuterVolumeSpecName: "sys") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313919 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-nvme\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313928 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313944 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.313977 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a4e5fff9-85c6-46ff-95d0-b330a8872471\" (UID: \"a4e5fff9-85c6-46ff-95d0-b330a8872471\") " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314154 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-logs" (OuterVolumeSpecName: "logs") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314295 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314309 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314318 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314326 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314318 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314333 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314380 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314395 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.314608 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.318523 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.318757 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-scripts" (OuterVolumeSpecName: "scripts") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.319083 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e5fff9-85c6-46ff-95d0-b330a8872471-kube-api-access-7rc2j" (OuterVolumeSpecName: "kube-api-access-7rc2j") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "kube-api-access-7rc2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.323759 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.356529 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-config-data" (OuterVolumeSpecName: "config-data") pod "a4e5fff9-85c6-46ff-95d0-b330a8872471" (UID: "a4e5fff9-85c6-46ff-95d0-b330a8872471"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415772 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415812 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415824 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4e5fff9-85c6-46ff-95d0-b330a8872471-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415841 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415854 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4e5fff9-85c6-46ff-95d0-b330a8872471-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415865 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rc2j\" (UniqueName: \"kubernetes.io/projected/a4e5fff9-85c6-46ff-95d0-b330a8872471-kube-api-access-7rc2j\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.415878 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a4e5fff9-85c6-46ff-95d0-b330a8872471-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.428765 4762 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.429295 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.517467 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.517503 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.846362 4762 generic.go:334] "Generic (PLEG): container finished" podID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerID="a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f" exitCode=143 Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.846409 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.847370 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a4e5fff9-85c6-46ff-95d0-b330a8872471","Type":"ContainerDied","Data":"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f"} Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.847396 4762 generic.go:334] "Generic (PLEG): container finished" podID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerID="e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e" exitCode=143 Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.847432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a4e5fff9-85c6-46ff-95d0-b330a8872471","Type":"ContainerDied","Data":"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e"} Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.847459 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a4e5fff9-85c6-46ff-95d0-b330a8872471","Type":"ContainerDied","Data":"a771dc9ea39459572a411a88133863ac65097e757612777094fde4c933f0172a"} Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.847489 4762 scope.go:117] "RemoveContainer" containerID="a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.880861 4762 scope.go:117] "RemoveContainer" containerID="e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.883328 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.891785 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:45 crc 
kubenswrapper[4762]: I0217 18:09:45.904155 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:45 crc kubenswrapper[4762]: E0217 18:09:45.904480 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-log" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.904494 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-log" Feb 17 18:09:45 crc kubenswrapper[4762]: E0217 18:09:45.904513 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-httpd" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.904523 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-httpd" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.904681 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-log" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.904700 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" containerName="glance-httpd" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.905224 4762 scope.go:117] "RemoveContainer" containerID="a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.905621 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:45 crc kubenswrapper[4762]: E0217 18:09:45.906099 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f\": container with ID starting with a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f not found: ID does not exist" containerID="a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.906137 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f"} err="failed to get container status \"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f\": rpc error: code = NotFound desc = could not find container \"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f\": container with ID starting with a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f not found: ID does not exist" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.906165 4762 scope.go:117] "RemoveContainer" containerID="e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e" Feb 17 18:09:45 crc kubenswrapper[4762]: E0217 18:09:45.907679 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e\": container with ID starting with e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e not found: ID does not exist" containerID="e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.907705 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e"} err="failed to get container status \"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e\": rpc error: code = NotFound desc = could not find container \"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e\": container with ID starting with e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e not found: ID does not exist" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.907728 4762 scope.go:117] "RemoveContainer" containerID="a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.908188 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f"} err="failed to get container status \"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f\": rpc error: code = NotFound desc = could not find container \"a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f\": container with ID starting with a863041bd459a8f1bc6e68cd3fc2e368dd612281367692d786595cf39687721f not found: ID does not exist" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.908211 4762 scope.go:117] "RemoveContainer" containerID="e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.908440 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e"} err="failed to get container status \"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e\": rpc error: code = NotFound desc = could not find container \"e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e\": container with ID starting with e7f5a231b3571f77506c5660b48ef4fb3a7ac583f7a191477e35ade7cb07ad4e not found: ID does not 
exist" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.910841 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.911105 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.910868 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-48hqc" Feb 17 18:09:45 crc kubenswrapper[4762]: I0217 18:09:45.914545 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.024743 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-nvme\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025166 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-httpd-run\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025220 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-lib-modules\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025299 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025330 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-logs\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025358 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-dev\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025385 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-config-data\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025411 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-run\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-sys\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025461 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc68r\" (UniqueName: \"kubernetes.io/projected/626e4e49-97a1-48c5-91aa-b612406db7b8-kube-api-access-xc68r\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025499 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.025519 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-scripts\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.126882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.126927 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-nvme\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.126931 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.126955 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-httpd-run\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.126980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.126990 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-nvme\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127013 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-lib-modules\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127044 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-logs\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127074 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-dev\") pod \"glance-default-single-0\" (UID: 
\"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127093 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-config-data\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127121 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-run\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127143 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-sys\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127160 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc68r\" (UniqueName: \"kubernetes.io/projected/626e4e49-97a1-48c5-91aa-b612406db7b8-kube-api-access-xc68r\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc 
kubenswrapper[4762]: I0217 18:09:46.127225 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-scripts\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127697 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-dev\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127730 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-lib-modules\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127813 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.127988 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-run\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.128136 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-sys\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.128233 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.128246 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-logs\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.128324 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-httpd-run\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.128313 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.130688 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-scripts\") pod 
\"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.131387 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-config-data\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.144769 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc68r\" (UniqueName: \"kubernetes.io/projected/626e4e49-97a1-48c5-91aa-b612406db7b8-kube-api-access-xc68r\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.150705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.169134 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.231103 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.448715 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:09:46 crc kubenswrapper[4762]: W0217 18:09:46.454780 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626e4e49_97a1_48c5_91aa_b612406db7b8.slice/crio-f7c253f30d1e46949c6ef0e624358c33a2051b5cb59610deadbee9421f0c61e6 WatchSource:0}: Error finding container f7c253f30d1e46949c6ef0e624358c33a2051b5cb59610deadbee9421f0c61e6: Status 404 returned error can't find the container with id f7c253f30d1e46949c6ef0e624358c33a2051b5cb59610deadbee9421f0c61e6 Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.857915 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"626e4e49-97a1-48c5-91aa-b612406db7b8","Type":"ContainerStarted","Data":"60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f"} Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.858362 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"626e4e49-97a1-48c5-91aa-b612406db7b8","Type":"ContainerStarted","Data":"6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77"} Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.858378 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"626e4e49-97a1-48c5-91aa-b612406db7b8","Type":"ContainerStarted","Data":"f7c253f30d1e46949c6ef0e624358c33a2051b5cb59610deadbee9421f0c61e6"} Feb 17 18:09:46 crc kubenswrapper[4762]: I0217 18:09:46.879102 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.879083082 podStartE2EDuration="1.879083082s" 
podCreationTimestamp="2026-02-17 18:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:09:46.876033525 +0000 UTC m=+1338.520951535" watchObservedRunningTime="2026-02-17 18:09:46.879083082 +0000 UTC m=+1338.524001092" Feb 17 18:09:47 crc kubenswrapper[4762]: I0217 18:09:47.046840 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e5fff9-85c6-46ff-95d0-b330a8872471" path="/var/lib/kubelet/pods/a4e5fff9-85c6-46ff-95d0-b330a8872471/volumes" Feb 17 18:09:56 crc kubenswrapper[4762]: I0217 18:09:56.232911 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:56 crc kubenswrapper[4762]: I0217 18:09:56.233462 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:56 crc kubenswrapper[4762]: I0217 18:09:56.262913 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:56 crc kubenswrapper[4762]: I0217 18:09:56.284470 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:56 crc kubenswrapper[4762]: I0217 18:09:56.929963 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:56 crc kubenswrapper[4762]: I0217 18:09:56.930229 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:58 crc kubenswrapper[4762]: I0217 18:09:58.895836 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:09:58 crc kubenswrapper[4762]: I0217 18:09:58.940570 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.068462 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.069883 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.080323 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.081330 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.081410 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.087447 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.264468 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.264930 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-httpd-run\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.264969 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-scripts\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265059 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-lib-modules\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265091 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-scripts\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265232 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-config-data\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265284 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265312 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-logs\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265338 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-sys\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265382 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-dev\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265444 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-run\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntghn\" (UniqueName: \"kubernetes.io/projected/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-kube-api-access-ntghn\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265543 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-httpd-run\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265586 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-lib-modules\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265706 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265748 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llwlq\" (UniqueName: \"kubernetes.io/projected/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-kube-api-access-llwlq\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265791 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-dev\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265834 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265866 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-config-data\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265883 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-logs\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265910 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-nvme\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265927 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265947 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-nvme\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.265970 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-sys\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.266087 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.266181 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-run\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367771 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-run\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367838 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntghn\" (UniqueName: \"kubernetes.io/projected/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-kube-api-access-ntghn\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367862 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-httpd-run\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367892 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-lib-modules\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367934 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367956 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llwlq\" (UniqueName: \"kubernetes.io/projected/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-kube-api-access-llwlq\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.367984 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-dev\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368012 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-config-data\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-logs\") pod \"glance-default-single-2\" (UID: 
\"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368084 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-nvme\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368106 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368126 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-nvme\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368150 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-sys\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368183 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc 
kubenswrapper[4762]: I0217 18:10:01.368215 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-run\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368236 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368264 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-httpd-run\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368287 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368319 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-scripts\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368346 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-lib-modules\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368368 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-scripts\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-config-data\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368422 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368441 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-logs\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368461 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-iscsi\") pod \"glance-default-single-2\" (UID: 
\"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368481 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-dev\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368498 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-sys\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368586 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-sys\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.368655 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-run\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369248 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369319 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-lib-modules\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369520 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-httpd-run\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369554 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369523 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-lib-modules\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-run\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369647 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369685 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-dev\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369788 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-nvme\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369823 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-sys\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369892 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") device mount path 
\"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369902 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.369933 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.370018 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.370042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-logs\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.370097 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-logs\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.370160 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-dev\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.370535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-httpd-run\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.370615 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-nvme\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.376263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-scripts\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.377096 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-scripts\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.378341 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-config-data\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.389820 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-config-data\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.391534 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntghn\" (UniqueName: \"kubernetes.io/projected/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-kube-api-access-ntghn\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.392446 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llwlq\" (UniqueName: \"kubernetes.io/projected/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-kube-api-access-llwlq\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.404449 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.404862 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-single-2\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.405449 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.406139 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.449457 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.457608 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.876983 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.922525 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:10:01 crc kubenswrapper[4762]: W0217 18:10:01.924365 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbf6bdac_22d8_4139_8949_6ee2b97c9c6e.slice/crio-8bfbacb7e2c876e604b59b64519c6534bf5f156be16f7e938bf7c7252994c3f7 WatchSource:0}: Error finding container 8bfbacb7e2c876e604b59b64519c6534bf5f156be16f7e938bf7c7252994c3f7: Status 404 returned error can't find the container with id 8bfbacb7e2c876e604b59b64519c6534bf5f156be16f7e938bf7c7252994c3f7 Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.962702 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1","Type":"ContainerStarted","Data":"d873151305d05c6db82637b886d23df09d43d5a0b9b72bf6edab181a973f3ba0"} Feb 17 18:10:01 crc kubenswrapper[4762]: I0217 18:10:01.963615 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e","Type":"ContainerStarted","Data":"8bfbacb7e2c876e604b59b64519c6534bf5f156be16f7e938bf7c7252994c3f7"} Feb 17 18:10:02 crc kubenswrapper[4762]: I0217 18:10:02.973150 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e","Type":"ContainerStarted","Data":"7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56"} Feb 17 18:10:02 crc kubenswrapper[4762]: I0217 18:10:02.973447 4762 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e","Type":"ContainerStarted","Data":"bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2"} Feb 17 18:10:02 crc kubenswrapper[4762]: I0217 18:10:02.977636 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1","Type":"ContainerStarted","Data":"030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d"} Feb 17 18:10:02 crc kubenswrapper[4762]: I0217 18:10:02.977690 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1","Type":"ContainerStarted","Data":"0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886"} Feb 17 18:10:03 crc kubenswrapper[4762]: I0217 18:10:03.008588 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.008568401 podStartE2EDuration="3.008568401s" podCreationTimestamp="2026-02-17 18:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:03.003938489 +0000 UTC m=+1354.648856499" watchObservedRunningTime="2026-02-17 18:10:03.008568401 +0000 UTC m=+1354.653486421" Feb 17 18:10:03 crc kubenswrapper[4762]: I0217 18:10:03.035375 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=3.035360043 podStartE2EDuration="3.035360043s" podCreationTimestamp="2026-02-17 18:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:03.030906306 +0000 UTC m=+1354.675824316" watchObservedRunningTime="2026-02-17 18:10:03.035360043 +0000 
UTC m=+1354.680278053" Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.558739 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.559460 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.559530 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.560539 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcb7e7b99c1665f4d4f459fb3d5e0f62dcd0b605d5942c6bcbc73ce48dfe3885"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.560673 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" containerID="cri-o://dcb7e7b99c1665f4d4f459fb3d5e0f62dcd0b605d5942c6bcbc73ce48dfe3885" gracePeriod=600 Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.999194 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" 
containerID="dcb7e7b99c1665f4d4f459fb3d5e0f62dcd0b605d5942c6bcbc73ce48dfe3885" exitCode=0 Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.999276 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"dcb7e7b99c1665f4d4f459fb3d5e0f62dcd0b605d5942c6bcbc73ce48dfe3885"} Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.999481 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b"} Feb 17 18:10:04 crc kubenswrapper[4762]: I0217 18:10:04.999501 4762 scope.go:117] "RemoveContainer" containerID="7383a3a662a9b124ecf96d7abf64c6e25de420f4076f78c28ca4eeb9a1cb55f6" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.450465 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.451005 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.459287 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.459334 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.475256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.491355 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.496923 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:11 crc kubenswrapper[4762]: I0217 18:10:11.517948 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:12 crc kubenswrapper[4762]: I0217 18:10:12.066097 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:12 crc kubenswrapper[4762]: I0217 18:10:12.066162 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:12 crc kubenswrapper[4762]: I0217 18:10:12.066173 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:12 crc kubenswrapper[4762]: I0217 18:10:12.066181 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:14 crc kubenswrapper[4762]: I0217 18:10:14.039886 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:14 crc kubenswrapper[4762]: I0217 18:10:14.040548 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:14 crc kubenswrapper[4762]: I0217 18:10:14.066335 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:14 crc kubenswrapper[4762]: I0217 18:10:14.076241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:15 crc kubenswrapper[4762]: I0217 18:10:15.419540 4762 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 18:10:15 crc kubenswrapper[4762]: I0217 18:10:15.431365 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.143330 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-log" containerID="cri-o://bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2" gracePeriod=30 Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.143527 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-log" containerID="cri-o://0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886" gracePeriod=30 Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.143840 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-httpd" containerID="cri-o://7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56" gracePeriod=30 Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.144096 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-httpd" containerID="cri-o://030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d" gracePeriod=30 Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.152084 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-1" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-log" probeResult="failure" output="Get 
\"http://10.217.0.139:9292/healthcheck\": EOF" Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.158269 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-2" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.138:9292/healthcheck\": EOF" Feb 17 18:10:16 crc kubenswrapper[4762]: I0217 18:10:16.159031 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-1" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.139:9292/healthcheck\": EOF" Feb 17 18:10:17 crc kubenswrapper[4762]: I0217 18:10:17.152457 4762 generic.go:334] "Generic (PLEG): container finished" podID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerID="bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2" exitCode=143 Feb 17 18:10:17 crc kubenswrapper[4762]: I0217 18:10:17.152525 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e","Type":"ContainerDied","Data":"bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2"} Feb 17 18:10:17 crc kubenswrapper[4762]: I0217 18:10:17.154658 4762 generic.go:334] "Generic (PLEG): container finished" podID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerID="0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886" exitCode=143 Feb 17 18:10:17 crc kubenswrapper[4762]: I0217 18:10:17.154686 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1","Type":"ContainerDied","Data":"0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886"} Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.847244 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.900268 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995724 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-logs\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995816 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-lib-modules\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995840 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-dev\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995863 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-iscsi\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995886 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-httpd-run\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc 
kubenswrapper[4762]: I0217 18:10:19.995890 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995907 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-dev" (OuterVolumeSpecName: "dev") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995918 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-config-data\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995938 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995952 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-logs\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995974 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.995996 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-run\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996042 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-sys\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996068 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996079 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-logs" (OuterVolumeSpecName: "logs") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: 
"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996088 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996105 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-var-locks-brick\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996135 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-scripts\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996168 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-nvme\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996179 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-run" (OuterVolumeSpecName: "run") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996213 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996227 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-httpd-run\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996266 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntghn\" (UniqueName: \"kubernetes.io/projected/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-kube-api-access-ntghn\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996302 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-iscsi\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996325 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-lib-modules\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996227 4762 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-logs" (OuterVolumeSpecName: "logs") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996238 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-sys" (OuterVolumeSpecName: "sys") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996227 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996265 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996350 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-config-data\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996532 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996674 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-scripts\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996723 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llwlq\" (UniqueName: \"kubernetes.io/projected/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-kube-api-access-llwlq\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996744 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996757 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-run\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996803 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-dev\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996816 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-run" (OuterVolumeSpecName: "run") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996823 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-sys\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996844 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-sys" (OuterVolumeSpecName: "sys") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996831 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996870 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-dev" (OuterVolumeSpecName: "dev") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996872 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-var-locks-brick\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996897 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996906 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-nvme\") pod \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\" (UID: \"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.996935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\" (UID: \"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e\") " Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997277 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997429 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997449 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997463 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997474 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997508 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997531 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997551 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997567 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997582 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997596 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997610 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997649 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997677 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997694 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997709 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997724 4762 reconciler_common.go:293] "Volume detached for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997739 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:19 crc kubenswrapper[4762]: I0217 18:10:19.997753 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.001759 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.001938 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.001972 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-kube-api-access-ntghn" (OuterVolumeSpecName: "kube-api-access-ntghn") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "kube-api-access-ntghn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.002690 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-kube-api-access-llwlq" (OuterVolumeSpecName: "kube-api-access-llwlq") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "kube-api-access-llwlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.002873 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-scripts" (OuterVolumeSpecName: "scripts") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.003110 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.004802 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.004867 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-scripts" (OuterVolumeSpecName: "scripts") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.034239 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-config-data" (OuterVolumeSpecName: "config-data") pod "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" (UID: "fbf6bdac-22d8-4139-8949-6ee2b97c9c6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.045266 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-config-data" (OuterVolumeSpecName: "config-data") pod "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" (UID: "eb5ec1b3-9751-411f-a9c6-733f7d4e83d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099664 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099710 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099725 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099737 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099794 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099804 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntghn\" (UniqueName: \"kubernetes.io/projected/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-kube-api-access-ntghn\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099816 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099824 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099833 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llwlq\" (UniqueName: \"kubernetes.io/projected/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1-kube-api-access-llwlq\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.099846 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.114076 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.116029 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.116120 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.116702 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.181734 4762 generic.go:334] "Generic (PLEG): container finished" podID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerID="7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56" exitCode=0 Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.181803 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e","Type":"ContainerDied","Data":"7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56"} Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.181838 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"fbf6bdac-22d8-4139-8949-6ee2b97c9c6e","Type":"ContainerDied","Data":"8bfbacb7e2c876e604b59b64519c6534bf5f156be16f7e938bf7c7252994c3f7"} Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.181856 4762 scope.go:117] "RemoveContainer" containerID="7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.181981 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.192072 4762 generic.go:334] "Generic (PLEG): container finished" podID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerID="030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d" exitCode=0 Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.192114 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1","Type":"ContainerDied","Data":"030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d"} Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.192141 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"eb5ec1b3-9751-411f-a9c6-733f7d4e83d1","Type":"ContainerDied","Data":"d873151305d05c6db82637b886d23df09d43d5a0b9b72bf6edab181a973f3ba0"} Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.192193 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.200826 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.200850 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.200859 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.200867 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.220418 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.224730 4762 scope.go:117] "RemoveContainer" containerID="bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.234428 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.247360 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.251760 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.259332 4762 scope.go:117] 
"RemoveContainer" containerID="7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56" Feb 17 18:10:20 crc kubenswrapper[4762]: E0217 18:10:20.263089 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56\": container with ID starting with 7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56 not found: ID does not exist" containerID="7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.263145 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56"} err="failed to get container status \"7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56\": rpc error: code = NotFound desc = could not find container \"7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56\": container with ID starting with 7f8d68b875027de4b1be919aac5a42645c572b6200419cafb3e9ca9dc2abba56 not found: ID does not exist" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.263178 4762 scope.go:117] "RemoveContainer" containerID="bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2" Feb 17 18:10:20 crc kubenswrapper[4762]: E0217 18:10:20.263903 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2\": container with ID starting with bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2 not found: ID does not exist" containerID="bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.263938 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2"} err="failed to get container status \"bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2\": rpc error: code = NotFound desc = could not find container \"bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2\": container with ID starting with bb13e01a4e53b8dbf8b03a8dca6f91d3b93699795992b761ddd1247db7cf92c2 not found: ID does not exist" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.263972 4762 scope.go:117] "RemoveContainer" containerID="030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.283311 4762 scope.go:117] "RemoveContainer" containerID="0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.297335 4762 scope.go:117] "RemoveContainer" containerID="030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d" Feb 17 18:10:20 crc kubenswrapper[4762]: E0217 18:10:20.297719 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d\": container with ID starting with 030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d not found: ID does not exist" containerID="030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.297751 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d"} err="failed to get container status \"030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d\": rpc error: code = NotFound desc = could not find container \"030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d\": container with ID starting with 
030a85d432f806af3ca082d70bf2aa2177d3542faa6740a3fa7834e3c3c8dc3d not found: ID does not exist" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.297774 4762 scope.go:117] "RemoveContainer" containerID="0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886" Feb 17 18:10:20 crc kubenswrapper[4762]: E0217 18:10:20.298197 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886\": container with ID starting with 0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886 not found: ID does not exist" containerID="0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.298291 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886"} err="failed to get container status \"0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886\": rpc error: code = NotFound desc = could not find container \"0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886\": container with ID starting with 0b69c05ad7e1dfb0a6978030285f49b7ac793b1439ce7ffe550de82f676ec886 not found: ID does not exist" Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.718652 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.719684 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-log" containerID="cri-o://6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77" gracePeriod=30 Feb 17 18:10:20 crc kubenswrapper[4762]: I0217 18:10:20.719774 4762 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-single-0" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-httpd" containerID="cri-o://60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f" gracePeriod=30 Feb 17 18:10:21 crc kubenswrapper[4762]: I0217 18:10:21.044688 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" path="/var/lib/kubelet/pods/eb5ec1b3-9751-411f-a9c6-733f7d4e83d1/volumes" Feb 17 18:10:21 crc kubenswrapper[4762]: I0217 18:10:21.045938 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" path="/var/lib/kubelet/pods/fbf6bdac-22d8-4139-8949-6ee2b97c9c6e/volumes" Feb 17 18:10:21 crc kubenswrapper[4762]: I0217 18:10:21.199866 4762 generic.go:334] "Generic (PLEG): container finished" podID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerID="6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77" exitCode=143 Feb 17 18:10:21 crc kubenswrapper[4762]: I0217 18:10:21.199927 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"626e4e49-97a1-48c5-91aa-b612406db7b8","Type":"ContainerDied","Data":"6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77"} Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.561518 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.684908 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-sys\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.684981 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-config-data\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685005 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc68r\" (UniqueName: \"kubernetes.io/projected/626e4e49-97a1-48c5-91aa-b612406db7b8-kube-api-access-xc68r\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685022 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685022 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-sys" (OuterVolumeSpecName: "sys") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685045 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-dev\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685080 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-nvme\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685101 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-iscsi\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685119 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-httpd-run\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685136 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-run\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685173 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-var-locks-brick\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685192 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-scripts\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685213 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685243 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-logs\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685260 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-lib-modules\") pod \"626e4e49-97a1-48c5-91aa-b612406db7b8\" (UID: \"626e4e49-97a1-48c5-91aa-b612406db7b8\") " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685515 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685568 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-lib-modules" 
(OuterVolumeSpecName: "lib-modules") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685809 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-run" (OuterVolumeSpecName: "run") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685837 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-dev" (OuterVolumeSpecName: "dev") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685866 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685886 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.685895 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.686061 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.686290 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-logs" (OuterVolumeSpecName: "logs") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.690237 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.690253 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626e4e49-97a1-48c5-91aa-b612406db7b8-kube-api-access-xc68r" (OuterVolumeSpecName: "kube-api-access-xc68r") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "kube-api-access-xc68r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.690372 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-scripts" (OuterVolumeSpecName: "scripts") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.691760 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.722811 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-config-data" (OuterVolumeSpecName: "config-data") pod "626e4e49-97a1-48c5-91aa-b612406db7b8" (UID: "626e4e49-97a1-48c5-91aa-b612406db7b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786787 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786826 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786861 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786873 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786884 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786895 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626e4e49-97a1-48c5-91aa-b612406db7b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786909 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc68r\" (UniqueName: \"kubernetes.io/projected/626e4e49-97a1-48c5-91aa-b612406db7b8-kube-api-access-xc68r\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786929 4762 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786943 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786953 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786963 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786973 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626e4e49-97a1-48c5-91aa-b612406db7b8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.786984 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/626e4e49-97a1-48c5-91aa-b612406db7b8-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.802522 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.802921 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.888240 4762 reconciler_common.go:293] "Volume detached for 
volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:24 crc kubenswrapper[4762]: I0217 18:10:24.888270 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.242964 4762 generic.go:334] "Generic (PLEG): container finished" podID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerID="60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f" exitCode=0 Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.243015 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"626e4e49-97a1-48c5-91aa-b612406db7b8","Type":"ContainerDied","Data":"60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f"} Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.243380 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"626e4e49-97a1-48c5-91aa-b612406db7b8","Type":"ContainerDied","Data":"f7c253f30d1e46949c6ef0e624358c33a2051b5cb59610deadbee9421f0c61e6"} Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.243407 4762 scope.go:117] "RemoveContainer" containerID="60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.243054 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.293024 4762 scope.go:117] "RemoveContainer" containerID="6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.305361 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.310216 4762 scope.go:117] "RemoveContainer" containerID="60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f" Feb 17 18:10:25 crc kubenswrapper[4762]: E0217 18:10:25.310712 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f\": container with ID starting with 60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f not found: ID does not exist" containerID="60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.310755 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f"} err="failed to get container status \"60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f\": rpc error: code = NotFound desc = could not find container \"60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f\": container with ID starting with 60c1c165158af2265942e0aac724ac971de20ce77618d96998b512df9214213f not found: ID does not exist" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.310789 4762 scope.go:117] "RemoveContainer" containerID="6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77" Feb 17 18:10:25 crc kubenswrapper[4762]: E0217 18:10:25.315101 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77\": container with ID starting with 6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77 not found: ID does not exist" containerID="6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.315132 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77"} err="failed to get container status \"6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77\": rpc error: code = NotFound desc = could not find container \"6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77\": container with ID starting with 6bc4c9c6431940bc0d742a051bc73347312d20b45288c68527b9422b10ea1a77 not found: ID does not exist" Feb 17 18:10:25 crc kubenswrapper[4762]: I0217 18:10:25.319893 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.021683 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-f8lg8"] Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.035591 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-f8lg8"] Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.089145 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance8c99-account-delete-dr2dm"] Feb 17 18:10:26 crc kubenswrapper[4762]: E0217 18:10:26.089641 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.089713 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: 
E0217 18:10:26.089779 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.089828 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: E0217 18:10:26.089894 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.089947 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: E0217 18:10:26.090001 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090048 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: E0217 18:10:26.090102 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090155 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: E0217 18:10:26.090205 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090280 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090536 4762 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090603 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090676 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf6bdac-22d8-4139-8949-6ee2b97c9c6e" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090741 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-log" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090798 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5ec1b3-9751-411f-a9c6-733f7d4e83d1" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.090858 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" containerName="glance-httpd" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.091369 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.103413 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance8c99-account-delete-dr2dm"] Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.208143 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hds2q\" (UniqueName: \"kubernetes.io/projected/cd8bfd77-576a-43ff-88fc-611aa196dc62-kube-api-access-hds2q\") pod \"glance8c99-account-delete-dr2dm\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.208211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd8bfd77-576a-43ff-88fc-611aa196dc62-operator-scripts\") pod \"glance8c99-account-delete-dr2dm\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.309774 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hds2q\" (UniqueName: \"kubernetes.io/projected/cd8bfd77-576a-43ff-88fc-611aa196dc62-kube-api-access-hds2q\") pod \"glance8c99-account-delete-dr2dm\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.309869 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd8bfd77-576a-43ff-88fc-611aa196dc62-operator-scripts\") pod \"glance8c99-account-delete-dr2dm\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 
18:10:26.310657 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd8bfd77-576a-43ff-88fc-611aa196dc62-operator-scripts\") pod \"glance8c99-account-delete-dr2dm\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.336848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hds2q\" (UniqueName: \"kubernetes.io/projected/cd8bfd77-576a-43ff-88fc-611aa196dc62-kube-api-access-hds2q\") pod \"glance8c99-account-delete-dr2dm\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.411914 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:26 crc kubenswrapper[4762]: I0217 18:10:26.878344 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance8c99-account-delete-dr2dm"] Feb 17 18:10:27 crc kubenswrapper[4762]: I0217 18:10:27.052921 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20676a86-92bc-497c-8bdd-54e5d7483c3c" path="/var/lib/kubelet/pods/20676a86-92bc-497c-8bdd-54e5d7483c3c/volumes" Feb 17 18:10:27 crc kubenswrapper[4762]: I0217 18:10:27.053702 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626e4e49-97a1-48c5-91aa-b612406db7b8" path="/var/lib/kubelet/pods/626e4e49-97a1-48c5-91aa-b612406db7b8/volumes" Feb 17 18:10:27 crc kubenswrapper[4762]: I0217 18:10:27.260359 4762 generic.go:334] "Generic (PLEG): container finished" podID="cd8bfd77-576a-43ff-88fc-611aa196dc62" containerID="04bb74d115faf16a7128615cd39c11e5e93ab518c8a30409dcaf77a8a9c70bc3" exitCode=0 Feb 17 18:10:27 crc kubenswrapper[4762]: I0217 18:10:27.260421 4762 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" event={"ID":"cd8bfd77-576a-43ff-88fc-611aa196dc62","Type":"ContainerDied","Data":"04bb74d115faf16a7128615cd39c11e5e93ab518c8a30409dcaf77a8a9c70bc3"} Feb 17 18:10:27 crc kubenswrapper[4762]: I0217 18:10:27.260463 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" event={"ID":"cd8bfd77-576a-43ff-88fc-611aa196dc62","Type":"ContainerStarted","Data":"257b1513dd681e9a6cb40a8444d8bdc64459562fa4b55a62dc0722d2e3444729"} Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.583930 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.752937 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hds2q\" (UniqueName: \"kubernetes.io/projected/cd8bfd77-576a-43ff-88fc-611aa196dc62-kube-api-access-hds2q\") pod \"cd8bfd77-576a-43ff-88fc-611aa196dc62\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.753039 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd8bfd77-576a-43ff-88fc-611aa196dc62-operator-scripts\") pod \"cd8bfd77-576a-43ff-88fc-611aa196dc62\" (UID: \"cd8bfd77-576a-43ff-88fc-611aa196dc62\") " Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.754046 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd8bfd77-576a-43ff-88fc-611aa196dc62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd8bfd77-576a-43ff-88fc-611aa196dc62" (UID: "cd8bfd77-576a-43ff-88fc-611aa196dc62"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.760998 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8bfd77-576a-43ff-88fc-611aa196dc62-kube-api-access-hds2q" (OuterVolumeSpecName: "kube-api-access-hds2q") pod "cd8bfd77-576a-43ff-88fc-611aa196dc62" (UID: "cd8bfd77-576a-43ff-88fc-611aa196dc62"). InnerVolumeSpecName "kube-api-access-hds2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.855007 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hds2q\" (UniqueName: \"kubernetes.io/projected/cd8bfd77-576a-43ff-88fc-611aa196dc62-kube-api-access-hds2q\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:28 crc kubenswrapper[4762]: I0217 18:10:28.855068 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd8bfd77-576a-43ff-88fc-611aa196dc62-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.127176 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:10:29 crc kubenswrapper[4762]: E0217 18:10:29.127770 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8bfd77-576a-43ff-88fc-611aa196dc62" containerName="mariadb-account-delete" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.127859 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8bfd77-576a-43ff-88fc-611aa196dc62" containerName="mariadb-account-delete" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.128067 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8bfd77-576a-43ff-88fc-611aa196dc62" containerName="mariadb-account-delete" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.128661 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.130483 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-h7wsp" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.130650 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.131068 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.136988 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.137477 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.259553 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-config\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.259647 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-config-secret\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.259710 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vtf\" (UniqueName: 
\"kubernetes.io/projected/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-kube-api-access-f9vtf\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.259740 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-scripts\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.274800 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" event={"ID":"cd8bfd77-576a-43ff-88fc-611aa196dc62","Type":"ContainerDied","Data":"257b1513dd681e9a6cb40a8444d8bdc64459562fa4b55a62dc0722d2e3444729"} Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.274836 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="257b1513dd681e9a6cb40a8444d8bdc64459562fa4b55a62dc0722d2e3444729" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.274850 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance8c99-account-delete-dr2dm" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.361051 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-config-secret\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.361137 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vtf\" (UniqueName: \"kubernetes.io/projected/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-kube-api-access-f9vtf\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.361165 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-scripts\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.361239 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-config\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.362036 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-config\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc 
kubenswrapper[4762]: I0217 18:10:29.363994 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-scripts\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.367511 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-openstack-config-secret\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.380130 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vtf\" (UniqueName: \"kubernetes.io/projected/eb0f2fa4-3b42-480e-b4c9-76d81b32a758-kube-api-access-f9vtf\") pod \"openstackclient\" (UID: \"eb0f2fa4-3b42-480e-b4c9-76d81b32a758\") " pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.443857 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 18:10:29 crc kubenswrapper[4762]: W0217 18:10:29.849731 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb0f2fa4_3b42_480e_b4c9_76d81b32a758.slice/crio-0349eceefdc0b2848c3046877c32b6027a3f9f8dc7b704479ec10648f7a04407 WatchSource:0}: Error finding container 0349eceefdc0b2848c3046877c32b6027a3f9f8dc7b704479ec10648f7a04407: Status 404 returned error can't find the container with id 0349eceefdc0b2848c3046877c32b6027a3f9f8dc7b704479ec10648f7a04407 Feb 17 18:10:29 crc kubenswrapper[4762]: I0217 18:10:29.859866 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 18:10:30 crc kubenswrapper[4762]: I0217 18:10:30.283415 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"eb0f2fa4-3b42-480e-b4c9-76d81b32a758","Type":"ContainerStarted","Data":"ad891c2fe669b1ab6cc973df4bbece8528ed425569e55e1251f45abc41ca7576"} Feb 17 18:10:30 crc kubenswrapper[4762]: I0217 18:10:30.283843 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"eb0f2fa4-3b42-480e-b4c9-76d81b32a758","Type":"ContainerStarted","Data":"0349eceefdc0b2848c3046877c32b6027a3f9f8dc7b704479ec10648f7a04407"} Feb 17 18:10:30 crc kubenswrapper[4762]: I0217 18:10:30.301283 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.301191006 podStartE2EDuration="1.301191006s" podCreationTimestamp="2026-02-17 18:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:30.298555961 +0000 UTC m=+1381.943474031" watchObservedRunningTime="2026-02-17 18:10:30.301191006 +0000 UTC m=+1381.946109056" Feb 17 18:10:31 crc kubenswrapper[4762]: 
I0217 18:10:31.110881 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-xpjjp"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.117170 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-xpjjp"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.134737 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-8c99-account-create-update-8qf6v"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.144149 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance8c99-account-delete-dr2dm"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.152324 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-8c99-account-create-update-8qf6v"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.157483 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance8c99-account-delete-dr2dm"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.196668 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-7b8c4"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.197662 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.208959 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-7b8c4"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.290208 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-operator-scripts\") pod \"glance-db-create-7b8c4\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.290837 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8cj9\" (UniqueName: \"kubernetes.io/projected/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-kube-api-access-l8cj9\") pod \"glance-db-create-7b8c4\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.308264 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-5f79-account-create-update-khjfs"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.310022 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.312740 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.313287 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5f79-account-create-update-khjfs"] Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.392218 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8cj9\" (UniqueName: \"kubernetes.io/projected/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-kube-api-access-l8cj9\") pod \"glance-db-create-7b8c4\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.392404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-operator-scripts\") pod \"glance-db-create-7b8c4\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.393154 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-operator-scripts\") pod \"glance-db-create-7b8c4\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.414751 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8cj9\" (UniqueName: \"kubernetes.io/projected/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-kube-api-access-l8cj9\") pod \"glance-db-create-7b8c4\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " 
pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.493573 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxdp\" (UniqueName: \"kubernetes.io/projected/b61af31a-dda6-45a3-97e7-d2c5271235e3-kube-api-access-tkxdp\") pod \"glance-5f79-account-create-update-khjfs\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.493933 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61af31a-dda6-45a3-97e7-d2c5271235e3-operator-scripts\") pod \"glance-5f79-account-create-update-khjfs\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.511456 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.594868 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkxdp\" (UniqueName: \"kubernetes.io/projected/b61af31a-dda6-45a3-97e7-d2c5271235e3-kube-api-access-tkxdp\") pod \"glance-5f79-account-create-update-khjfs\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.594970 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61af31a-dda6-45a3-97e7-d2c5271235e3-operator-scripts\") pod \"glance-5f79-account-create-update-khjfs\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.595690 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61af31a-dda6-45a3-97e7-d2c5271235e3-operator-scripts\") pod \"glance-5f79-account-create-update-khjfs\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.611573 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkxdp\" (UniqueName: \"kubernetes.io/projected/b61af31a-dda6-45a3-97e7-d2c5271235e3-kube-api-access-tkxdp\") pod \"glance-5f79-account-create-update-khjfs\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.629500 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:31 crc kubenswrapper[4762]: I0217 18:10:31.967842 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-7b8c4"] Feb 17 18:10:31 crc kubenswrapper[4762]: W0217 18:10:31.967918 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aa2f592_8607_4156_8a42_e3b2f0d5ab50.slice/crio-67e53c459ff77b4984cf7bd9d16213064cf071f82dff232b11b3ff6e8a1fac43 WatchSource:0}: Error finding container 67e53c459ff77b4984cf7bd9d16213064cf071f82dff232b11b3ff6e8a1fac43: Status 404 returned error can't find the container with id 67e53c459ff77b4984cf7bd9d16213064cf071f82dff232b11b3ff6e8a1fac43 Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.060525 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5f79-account-create-update-khjfs"] Feb 17 18:10:32 crc kubenswrapper[4762]: W0217 18:10:32.068289 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61af31a_dda6_45a3_97e7_d2c5271235e3.slice/crio-afb7efbf7bbbe575d4d9ea582d3e6750286e49abe0a6e4ce3f25ee8378ea2e92 WatchSource:0}: Error finding container afb7efbf7bbbe575d4d9ea582d3e6750286e49abe0a6e4ce3f25ee8378ea2e92: Status 404 returned error can't find the container with id afb7efbf7bbbe575d4d9ea582d3e6750286e49abe0a6e4ce3f25ee8378ea2e92 Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.313583 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" event={"ID":"b61af31a-dda6-45a3-97e7-d2c5271235e3","Type":"ContainerStarted","Data":"b3675b1e2a2efb64b22a53c49628bb3522aee2a610dc2f1f330b613c8fd1c4d7"} Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.313651 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" event={"ID":"b61af31a-dda6-45a3-97e7-d2c5271235e3","Type":"ContainerStarted","Data":"afb7efbf7bbbe575d4d9ea582d3e6750286e49abe0a6e4ce3f25ee8378ea2e92"} Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.318389 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7b8c4" event={"ID":"0aa2f592-8607-4156-8a42-e3b2f0d5ab50","Type":"ContainerStarted","Data":"3e5ac99f54d26fb23138b45fb01cf1815fca73cfc160fe03d2f76a4b13c9b163"} Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.318439 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7b8c4" event={"ID":"0aa2f592-8607-4156-8a42-e3b2f0d5ab50","Type":"ContainerStarted","Data":"67e53c459ff77b4984cf7bd9d16213064cf071f82dff232b11b3ff6e8a1fac43"} Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.335593 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" podStartSLOduration=1.335571071 podStartE2EDuration="1.335571071s" podCreationTimestamp="2026-02-17 18:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:32.32921185 +0000 UTC m=+1383.974129870" watchObservedRunningTime="2026-02-17 18:10:32.335571071 +0000 UTC m=+1383.980489081" Feb 17 18:10:32 crc kubenswrapper[4762]: I0217 18:10:32.346182 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-7b8c4" podStartSLOduration=1.346161092 podStartE2EDuration="1.346161092s" podCreationTimestamp="2026-02-17 18:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:32.343865537 +0000 UTC m=+1383.988783547" watchObservedRunningTime="2026-02-17 18:10:32.346161092 +0000 UTC 
m=+1383.991079102" Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.042452 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084491d6-9b5a-4a3c-b8b5-c887667fa167" path="/var/lib/kubelet/pods/084491d6-9b5a-4a3c-b8b5-c887667fa167/volumes" Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.043021 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6e1e23-e91b-4d8d-a520-4d599a2d1538" path="/var/lib/kubelet/pods/cd6e1e23-e91b-4d8d-a520-4d599a2d1538/volumes" Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.043476 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8bfd77-576a-43ff-88fc-611aa196dc62" path="/var/lib/kubelet/pods/cd8bfd77-576a-43ff-88fc-611aa196dc62/volumes" Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.326319 4762 generic.go:334] "Generic (PLEG): container finished" podID="b61af31a-dda6-45a3-97e7-d2c5271235e3" containerID="b3675b1e2a2efb64b22a53c49628bb3522aee2a610dc2f1f330b613c8fd1c4d7" exitCode=0 Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.326389 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" event={"ID":"b61af31a-dda6-45a3-97e7-d2c5271235e3","Type":"ContainerDied","Data":"b3675b1e2a2efb64b22a53c49628bb3522aee2a610dc2f1f330b613c8fd1c4d7"} Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.328812 4762 generic.go:334] "Generic (PLEG): container finished" podID="0aa2f592-8607-4156-8a42-e3b2f0d5ab50" containerID="3e5ac99f54d26fb23138b45fb01cf1815fca73cfc160fe03d2f76a4b13c9b163" exitCode=0 Feb 17 18:10:33 crc kubenswrapper[4762]: I0217 18:10:33.328852 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7b8c4" event={"ID":"0aa2f592-8607-4156-8a42-e3b2f0d5ab50","Type":"ContainerDied","Data":"3e5ac99f54d26fb23138b45fb01cf1815fca73cfc160fe03d2f76a4b13c9b163"} Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.651907 4762 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.658974 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.743041 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8cj9\" (UniqueName: \"kubernetes.io/projected/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-kube-api-access-l8cj9\") pod \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.743081 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-operator-scripts\") pod \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\" (UID: \"0aa2f592-8607-4156-8a42-e3b2f0d5ab50\") " Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.743223 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61af31a-dda6-45a3-97e7-d2c5271235e3-operator-scripts\") pod \"b61af31a-dda6-45a3-97e7-d2c5271235e3\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.743368 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkxdp\" (UniqueName: \"kubernetes.io/projected/b61af31a-dda6-45a3-97e7-d2c5271235e3-kube-api-access-tkxdp\") pod \"b61af31a-dda6-45a3-97e7-d2c5271235e3\" (UID: \"b61af31a-dda6-45a3-97e7-d2c5271235e3\") " Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.743950 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b61af31a-dda6-45a3-97e7-d2c5271235e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b61af31a-dda6-45a3-97e7-d2c5271235e3" (UID: "b61af31a-dda6-45a3-97e7-d2c5271235e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.744264 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0aa2f592-8607-4156-8a42-e3b2f0d5ab50" (UID: "0aa2f592-8607-4156-8a42-e3b2f0d5ab50"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.748130 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61af31a-dda6-45a3-97e7-d2c5271235e3-kube-api-access-tkxdp" (OuterVolumeSpecName: "kube-api-access-tkxdp") pod "b61af31a-dda6-45a3-97e7-d2c5271235e3" (UID: "b61af31a-dda6-45a3-97e7-d2c5271235e3"). InnerVolumeSpecName "kube-api-access-tkxdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.750030 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-kube-api-access-l8cj9" (OuterVolumeSpecName: "kube-api-access-l8cj9") pod "0aa2f592-8607-4156-8a42-e3b2f0d5ab50" (UID: "0aa2f592-8607-4156-8a42-e3b2f0d5ab50"). InnerVolumeSpecName "kube-api-access-l8cj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.844691 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8cj9\" (UniqueName: \"kubernetes.io/projected/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-kube-api-access-l8cj9\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.844721 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa2f592-8607-4156-8a42-e3b2f0d5ab50-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.844733 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61af31a-dda6-45a3-97e7-d2c5271235e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:34 crc kubenswrapper[4762]: I0217 18:10:34.844747 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkxdp\" (UniqueName: \"kubernetes.io/projected/b61af31a-dda6-45a3-97e7-d2c5271235e3-kube-api-access-tkxdp\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:35 crc kubenswrapper[4762]: I0217 18:10:35.344567 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" event={"ID":"b61af31a-dda6-45a3-97e7-d2c5271235e3","Type":"ContainerDied","Data":"afb7efbf7bbbe575d4d9ea582d3e6750286e49abe0a6e4ce3f25ee8378ea2e92"} Feb 17 18:10:35 crc kubenswrapper[4762]: I0217 18:10:35.344601 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5f79-account-create-update-khjfs" Feb 17 18:10:35 crc kubenswrapper[4762]: I0217 18:10:35.344606 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb7efbf7bbbe575d4d9ea582d3e6750286e49abe0a6e4ce3f25ee8378ea2e92" Feb 17 18:10:35 crc kubenswrapper[4762]: I0217 18:10:35.347292 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7b8c4" event={"ID":"0aa2f592-8607-4156-8a42-e3b2f0d5ab50","Type":"ContainerDied","Data":"67e53c459ff77b4984cf7bd9d16213064cf071f82dff232b11b3ff6e8a1fac43"} Feb 17 18:10:35 crc kubenswrapper[4762]: I0217 18:10:35.347320 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67e53c459ff77b4984cf7bd9d16213064cf071f82dff232b11b3ff6e8a1fac43" Feb 17 18:10:35 crc kubenswrapper[4762]: I0217 18:10:35.347364 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7b8c4" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.520190 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-cpzsw"] Feb 17 18:10:36 crc kubenswrapper[4762]: E0217 18:10:36.520522 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61af31a-dda6-45a3-97e7-d2c5271235e3" containerName="mariadb-account-create-update" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.520539 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61af31a-dda6-45a3-97e7-d2c5271235e3" containerName="mariadb-account-create-update" Feb 17 18:10:36 crc kubenswrapper[4762]: E0217 18:10:36.520553 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa2f592-8607-4156-8a42-e3b2f0d5ab50" containerName="mariadb-database-create" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.520561 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa2f592-8607-4156-8a42-e3b2f0d5ab50" 
containerName="mariadb-database-create" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.520749 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa2f592-8607-4156-8a42-e3b2f0d5ab50" containerName="mariadb-database-create" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.520769 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61af31a-dda6-45a3-97e7-d2c5271235e3" containerName="mariadb-account-create-update" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.521371 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.524351 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.524540 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-wlv6g" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.531576 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-cpzsw"] Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.690491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-config-data\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.690590 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bszq4\" (UniqueName: \"kubernetes.io/projected/cb5ca87d-b094-4631-a254-f190fa5c5822-kube-api-access-bszq4\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc 
kubenswrapper[4762]: I0217 18:10:36.690685 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-db-sync-config-data\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.792232 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bszq4\" (UniqueName: \"kubernetes.io/projected/cb5ca87d-b094-4631-a254-f190fa5c5822-kube-api-access-bszq4\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.792314 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-db-sync-config-data\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.792369 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-config-data\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.796840 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-config-data\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.809688 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-db-sync-config-data\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.810387 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bszq4\" (UniqueName: \"kubernetes.io/projected/cb5ca87d-b094-4631-a254-f190fa5c5822-kube-api-access-bszq4\") pod \"glance-db-sync-cpzsw\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:36 crc kubenswrapper[4762]: I0217 18:10:36.846280 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:37 crc kubenswrapper[4762]: I0217 18:10:37.305613 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-cpzsw"] Feb 17 18:10:37 crc kubenswrapper[4762]: I0217 18:10:37.361336 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cpzsw" event={"ID":"cb5ca87d-b094-4631-a254-f190fa5c5822","Type":"ContainerStarted","Data":"00cc983b6853663ce5fe12567e3540abb18304a5b841c78c8bca5761edb4dae0"} Feb 17 18:10:38 crc kubenswrapper[4762]: I0217 18:10:38.369570 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cpzsw" event={"ID":"cb5ca87d-b094-4631-a254-f190fa5c5822","Type":"ContainerStarted","Data":"fbc162e39929170a1335ace70c80f005e1c9009c6446fc5146e86f1e0cedd123"} Feb 17 18:10:38 crc kubenswrapper[4762]: I0217 18:10:38.386661 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-cpzsw" podStartSLOduration=2.386642103 podStartE2EDuration="2.386642103s" podCreationTimestamp="2026-02-17 18:10:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:38.383735991 +0000 UTC m=+1390.028654001" watchObservedRunningTime="2026-02-17 18:10:38.386642103 +0000 UTC m=+1390.031560113" Feb 17 18:10:41 crc kubenswrapper[4762]: I0217 18:10:41.402420 4762 generic.go:334] "Generic (PLEG): container finished" podID="cb5ca87d-b094-4631-a254-f190fa5c5822" containerID="fbc162e39929170a1335ace70c80f005e1c9009c6446fc5146e86f1e0cedd123" exitCode=0 Feb 17 18:10:41 crc kubenswrapper[4762]: I0217 18:10:41.402508 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cpzsw" event={"ID":"cb5ca87d-b094-4631-a254-f190fa5c5822","Type":"ContainerDied","Data":"fbc162e39929170a1335ace70c80f005e1c9009c6446fc5146e86f1e0cedd123"} Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.731977 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.876048 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bszq4\" (UniqueName: \"kubernetes.io/projected/cb5ca87d-b094-4631-a254-f190fa5c5822-kube-api-access-bszq4\") pod \"cb5ca87d-b094-4631-a254-f190fa5c5822\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.876097 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-config-data\") pod \"cb5ca87d-b094-4631-a254-f190fa5c5822\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.876144 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-db-sync-config-data\") pod \"cb5ca87d-b094-4631-a254-f190fa5c5822\" (UID: \"cb5ca87d-b094-4631-a254-f190fa5c5822\") " Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.881196 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5ca87d-b094-4631-a254-f190fa5c5822-kube-api-access-bszq4" (OuterVolumeSpecName: "kube-api-access-bszq4") pod "cb5ca87d-b094-4631-a254-f190fa5c5822" (UID: "cb5ca87d-b094-4631-a254-f190fa5c5822"). InnerVolumeSpecName "kube-api-access-bszq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.882798 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cb5ca87d-b094-4631-a254-f190fa5c5822" (UID: "cb5ca87d-b094-4631-a254-f190fa5c5822"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.911791 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-config-data" (OuterVolumeSpecName: "config-data") pod "cb5ca87d-b094-4631-a254-f190fa5c5822" (UID: "cb5ca87d-b094-4631-a254-f190fa5c5822"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.977796 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.977832 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bszq4\" (UniqueName: \"kubernetes.io/projected/cb5ca87d-b094-4631-a254-f190fa5c5822-kube-api-access-bszq4\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:42 crc kubenswrapper[4762]: I0217 18:10:42.977846 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5ca87d-b094-4631-a254-f190fa5c5822-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:43 crc kubenswrapper[4762]: I0217 18:10:43.421856 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cpzsw" Feb 17 18:10:43 crc kubenswrapper[4762]: I0217 18:10:43.421884 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cpzsw" event={"ID":"cb5ca87d-b094-4631-a254-f190fa5c5822","Type":"ContainerDied","Data":"00cc983b6853663ce5fe12567e3540abb18304a5b841c78c8bca5761edb4dae0"} Feb 17 18:10:43 crc kubenswrapper[4762]: I0217 18:10:43.422320 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00cc983b6853663ce5fe12567e3540abb18304a5b841c78c8bca5761edb4dae0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.401984 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:10:44 crc kubenswrapper[4762]: E0217 18:10:44.402276 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5ca87d-b094-4631-a254-f190fa5c5822" containerName="glance-db-sync" Feb 17 18:10:44 crc 
kubenswrapper[4762]: I0217 18:10:44.402291 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5ca87d-b094-4631-a254-f190fa5c5822" containerName="glance-db-sync" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.402461 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5ca87d-b094-4631-a254-f190fa5c5822" containerName="glance-db-sync" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.403289 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.405235 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-wlv6g" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.405587 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.410084 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.414721 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600220 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khfs5\" (UniqueName: \"kubernetes.io/projected/3419cbbe-0b7e-4c04-925f-1a741ff25114-kube-api-access-khfs5\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600288 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3419cbbe-0b7e-4c04-925f-1a741ff25114-httpd-run\") pod 
\"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600337 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600361 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419cbbe-0b7e-4c04-925f-1a741ff25114-scripts\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600391 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600432 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-sys\") pod 
\"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600500 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419cbbe-0b7e-4c04-925f-1a741ff25114-config-data\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600568 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600644 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3419cbbe-0b7e-4c04-925f-1a741ff25114-logs\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600671 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-dev\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600702 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600729 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.600758 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-run\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702546 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-sys\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702571 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3419cbbe-0b7e-4c04-925f-1a741ff25114-config-data\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702656 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-sys\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702665 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702712 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702755 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3419cbbe-0b7e-4c04-925f-1a741ff25114-logs\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702773 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-dev\") pod 
\"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702793 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702807 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702833 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-run\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khfs5\" (UniqueName: \"kubernetes.io/projected/3419cbbe-0b7e-4c04-925f-1a741ff25114-kube-api-access-khfs5\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.702887 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3419cbbe-0b7e-4c04-925f-1a741ff25114-httpd-run\") pod 
\"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703281 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419cbbe-0b7e-4c04-925f-1a741ff25114-scripts\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703391 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-run\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703569 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") device mount path 
\"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703873 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703936 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703960 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.703971 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3419cbbe-0b7e-4c04-925f-1a741ff25114-dev\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.704057 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3419cbbe-0b7e-4c04-925f-1a741ff25114-logs\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc 
kubenswrapper[4762]: I0217 18:10:44.704103 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3419cbbe-0b7e-4c04-925f-1a741ff25114-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.708555 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.710759 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419cbbe-0b7e-4c04-925f-1a741ff25114-config-data\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.714395 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419cbbe-0b7e-4c04-925f-1a741ff25114-scripts\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.755868 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.767835 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-1\" (UID: 
\"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.769307 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.781595 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.787806 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khfs5\" (UniqueName: \"kubernetes.io/projected/3419cbbe-0b7e-4c04-925f-1a741ff25114-kube-api-access-khfs5\") pod \"glance-default-external-api-1\" (UID: \"3419cbbe-0b7e-4c04-925f-1a741ff25114\") " pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.797012 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.810690 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.812989 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.816949 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.818686 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.819101 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.844632 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.852433 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.905528 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-logs\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.905578 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm46n\" (UniqueName: \"kubernetes.io/projected/13a4848c-ddb1-4915-9615-8470fac770de-kube-api-access-zm46n\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.905867 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.905957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-run\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.905998 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-run\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906056 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906113 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-config-data\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906139 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-sys\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906280 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-logs\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906312 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-scripts\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906353 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906375 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-dev\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906392 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-dev\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906416 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906440 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906459 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906477 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-sys\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 
18:10:44.906498 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906515 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906540 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906569 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906584 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc 
kubenswrapper[4762]: I0217 18:10:44.906604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zm2\" (UniqueName: \"kubernetes.io/projected/1a02bd31-330d-41d8-be42-f78d0036a9b2-kube-api-access-z6zm2\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906638 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906655 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:44 crc kubenswrapper[4762]: I0217 18:10:44.906691 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007736 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007792 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007823 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007856 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") 
" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007890 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-scripts\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007928 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.007944 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-config-data\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008036 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008072 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-logs\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008110 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm46n\" (UniqueName: \"kubernetes.io/projected/13a4848c-ddb1-4915-9615-8470fac770de-kube-api-access-zm46n\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008279 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-run\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 
18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008327 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008368 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-run\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008403 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-dev\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008422 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-run\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008440 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-run\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008511 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008533 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-logs\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008541 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8xhf\" (UniqueName: \"kubernetes.io/projected/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-kube-api-access-j8xhf\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008478 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-run\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008565 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008606 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008681 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-config-data\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008928 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.008966 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-sys\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009012 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-logs\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009044 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009080 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-logs\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009111 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-scripts\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009149 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009195 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-dev\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009230 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009260 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-dev\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009314 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009370 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-sys\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009398 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009434 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-sys\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009515 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009551 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009601 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009674 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009706 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.009745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zm2\" (UniqueName: 
\"kubernetes.io/projected/1a02bd31-330d-41d8-be42-f78d0036a9b2-kube-api-access-z6zm2\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010067 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-dev\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010079 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010118 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-dev\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010151 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010166 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010194 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010489 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010197 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010252 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-sys\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010255 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-sys\") pod \"glance-default-external-api-0\" 
(UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010339 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010375 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.010524 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-logs\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.014612 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.018909 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-scripts\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.018987 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.019865 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.035489 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-config-data\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.061819 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.067872 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm46n\" (UniqueName: \"kubernetes.io/projected/13a4848c-ddb1-4915-9615-8470fac770de-kube-api-access-zm46n\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.076378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.076770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.079265 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zm2\" (UniqueName: \"kubernetes.io/projected/1a02bd31-330d-41d8-be42-f78d0036a9b2-kube-api-access-z6zm2\") pod \"glance-default-external-api-0\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.112014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.112500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.112704 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-scripts\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.112874 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.113025 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-config-data\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.113358 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.113564 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-dev\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.113935 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-run\") pod 
\"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114030 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8xhf\" (UniqueName: \"kubernetes.io/projected/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-kube-api-access-j8xhf\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114284 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114439 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-logs\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114579 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-httpd-run\") pod \"glance-default-internal-api-1\" 
(UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114776 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-sys\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.114818 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.115065 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-sys\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.115696 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.115727 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.116451 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-dev\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.116499 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-run\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.116530 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.116948 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.117442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc 
kubenswrapper[4762]: I0217 18:10:45.117510 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-logs\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.117770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.123029 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.129507 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-scripts\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.142210 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.146232 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-config-data\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.152358 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.157684 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8xhf\" (UniqueName: \"kubernetes.io/projected/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-kube-api-access-j8xhf\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.160800 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.164559 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.167512 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:45 crc kubenswrapper[4762]: E0217 18:10:45.213173 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-storage-0" podUID="ae866fa5-748d-4935-a3d2-2fe08bc9693f" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.438293 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.597685 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.657952 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:10:45 crc kubenswrapper[4762]: W0217 18:10:45.664438 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a02bd31_330d_41d8_be42_f78d0036a9b2.slice/crio-fa960f6806f14c89906601f66c3458e9f770f70e65863c8f34439cda60f8fc7c WatchSource:0}: Error finding container fa960f6806f14c89906601f66c3458e9f770f70e65863c8f34439cda60f8fc7c: Status 404 returned error can't find the container with id fa960f6806f14c89906601f66c3458e9f770f70e65863c8f34439cda60f8fc7c Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.730324 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.759640 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:10:45 crc kubenswrapper[4762]: I0217 18:10:45.779157 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 
18:10:45 crc kubenswrapper[4762]: W0217 18:10:45.783115 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3176010_3aaa_4b0c_b6d2_c23e0583b7a5.slice/crio-b2f926a6bdcbadc3a62145834cd5915ad819aff7a99423eba0d5aeaabb9554f3 WatchSource:0}: Error finding container b2f926a6bdcbadc3a62145834cd5915ad819aff7a99423eba0d5aeaabb9554f3: Status 404 returned error can't find the container with id b2f926a6bdcbadc3a62145834cd5915ad819aff7a99423eba0d5aeaabb9554f3 Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.448021 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"13a4848c-ddb1-4915-9615-8470fac770de","Type":"ContainerStarted","Data":"4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.448553 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"13a4848c-ddb1-4915-9615-8470fac770de","Type":"ContainerStarted","Data":"7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.448565 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"13a4848c-ddb1-4915-9615-8470fac770de","Type":"ContainerStarted","Data":"f63a7d61b6b27fab2b01fc2089282f9bdb523fc5360b5f05aa3403422727f3ce"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.451418 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"3419cbbe-0b7e-4c04-925f-1a741ff25114","Type":"ContainerStarted","Data":"dcc15dc64b91786ab523ae5035c042c7ac9254c1ba1b3c0a3a04133f637d6ffc"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.451457 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"3419cbbe-0b7e-4c04-925f-1a741ff25114","Type":"ContainerStarted","Data":"02dd6210e79cdd301b10066f078d7260bcfe29f8b4b35773c5b04d7268addb6e"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.451469 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"3419cbbe-0b7e-4c04-925f-1a741ff25114","Type":"ContainerStarted","Data":"ac4e44a5e7215ae73d69089a633378570740b4f80c4dc5759781fa29b87d3167"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.454109 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"1a02bd31-330d-41d8-be42-f78d0036a9b2","Type":"ContainerStarted","Data":"6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.454142 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"1a02bd31-330d-41d8-be42-f78d0036a9b2","Type":"ContainerStarted","Data":"f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.454154 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"1a02bd31-330d-41d8-be42-f78d0036a9b2","Type":"ContainerStarted","Data":"fa960f6806f14c89906601f66c3458e9f770f70e65863c8f34439cda60f8fc7c"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.456271 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5","Type":"ContainerStarted","Data":"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.456299 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5","Type":"ContainerStarted","Data":"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.456313 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5","Type":"ContainerStarted","Data":"b2f926a6bdcbadc3a62145834cd5915ad819aff7a99423eba0d5aeaabb9554f3"} Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.456413 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-log" containerID="cri-o://e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69" gracePeriod=30 Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.456547 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-httpd" containerID="cri-o://24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e" gracePeriod=30 Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.469051 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.469034966 podStartE2EDuration="3.469034966s" podCreationTimestamp="2026-02-17 18:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:46.468078738 +0000 UTC m=+1398.112996748" watchObservedRunningTime="2026-02-17 18:10:46.469034966 +0000 UTC m=+1398.113952966" Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.492202 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.4921793340000002 podStartE2EDuration="2.492179334s" podCreationTimestamp="2026-02-17 18:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:46.488209851 +0000 UTC m=+1398.133127871" watchObservedRunningTime="2026-02-17 18:10:46.492179334 +0000 UTC m=+1398.137097354" Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.515584 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.515550008 podStartE2EDuration="3.515550008s" podCreationTimestamp="2026-02-17 18:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:46.515090335 +0000 UTC m=+1398.160008345" watchObservedRunningTime="2026-02-17 18:10:46.515550008 +0000 UTC m=+1398.160468018" Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.539262 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.539233712 podStartE2EDuration="3.539233712s" podCreationTimestamp="2026-02-17 18:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:46.537179243 +0000 UTC m=+1398.182097253" watchObservedRunningTime="2026-02-17 18:10:46.539233712 +0000 UTC m=+1398.184151722" Feb 17 18:10:46 crc kubenswrapper[4762]: I0217 18:10:46.893600 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048269 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-var-locks-brick\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048373 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-run\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048403 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-nvme\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048432 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-dev\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048453 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-lib-modules\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048487 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-iscsi\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048506 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-sys\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048586 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-config-data\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048608 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-scripts\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048682 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-httpd-run\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048744 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-logs\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048858 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048891 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.048925 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8xhf\" (UniqueName: \"kubernetes.io/projected/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-kube-api-access-j8xhf\") pod \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\" (UID: \"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5\") " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050530 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-sys" (OuterVolumeSpecName: "sys") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050578 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050604 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-run" (OuterVolumeSpecName: "run") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050642 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050666 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-dev" (OuterVolumeSpecName: "dev") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050688 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050711 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050813 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-logs" (OuterVolumeSpecName: "logs") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.050846 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.056143 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-scripts" (OuterVolumeSpecName: "scripts") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.056306 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.056571 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-kube-api-access-j8xhf" (OuterVolumeSpecName: "kube-api-access-j8xhf") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "kube-api-access-j8xhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.066748 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.089785 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-config-data" (OuterVolumeSpecName: "config-data") pod "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" (UID: "d3176010-3aaa-4b0c-b6d2-c23e0583b7a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.150926 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151215 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151294 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151353 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151405 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151459 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151516 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151574 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151655 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151712 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.151795 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.152230 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.152368 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8xhf\" (UniqueName: \"kubernetes.io/projected/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-kube-api-access-j8xhf\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.152453 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.165812 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.167054 4762 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.254084 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.254113 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466484 4762 generic.go:334] "Generic (PLEG): container finished" podID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerID="24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e" exitCode=143 Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466516 4762 generic.go:334] "Generic (PLEG): container finished" podID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerID="e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69" exitCode=143 Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5","Type":"ContainerDied","Data":"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e"} Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466582 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5","Type":"ContainerDied","Data":"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69"} Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466652 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"d3176010-3aaa-4b0c-b6d2-c23e0583b7a5","Type":"ContainerDied","Data":"b2f926a6bdcbadc3a62145834cd5915ad819aff7a99423eba0d5aeaabb9554f3"} Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.466663 4762 scope.go:117] "RemoveContainer" containerID="24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.487842 4762 scope.go:117] "RemoveContainer" containerID="e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.505978 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.513520 4762 scope.go:117] "RemoveContainer" containerID="24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e" Feb 17 18:10:47 crc kubenswrapper[4762]: E0217 18:10:47.514348 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e\": container with ID starting with 24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e not found: ID does not exist" containerID="24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.514380 4762 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e"} err="failed to get container status \"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e\": rpc error: code = NotFound desc = could not find container \"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e\": container with ID starting with 24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e not found: ID does not exist" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.514400 4762 scope.go:117] "RemoveContainer" containerID="e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69" Feb 17 18:10:47 crc kubenswrapper[4762]: E0217 18:10:47.514755 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69\": container with ID starting with e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69 not found: ID does not exist" containerID="e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.514783 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69"} err="failed to get container status \"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69\": rpc error: code = NotFound desc = could not find container \"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69\": container with ID starting with e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69 not found: ID does not exist" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.514799 4762 scope.go:117] "RemoveContainer" containerID="24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.515014 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e"} err="failed to get container status \"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e\": rpc error: code = NotFound desc = could not find container \"24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e\": container with ID starting with 24e067d232c7c3bc720be2c77785f76cb18d95115ef3369a4341943509b2595e not found: ID does not exist" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.515033 4762 scope.go:117] "RemoveContainer" containerID="e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.515211 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69"} err="failed to get container status \"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69\": rpc error: code = NotFound desc = could not find container \"e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69\": container with ID starting with e0af1bfb1c647866d36eef3457321bb3b335b74baebb53f09141aaa418ce2d69 not found: ID does not exist" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.522559 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.532372 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:47 crc kubenswrapper[4762]: E0217 18:10:47.532803 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-httpd" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.532829 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-httpd" Feb 17 18:10:47 crc kubenswrapper[4762]: E0217 18:10:47.532866 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-log" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.532879 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-log" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.533092 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-log" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.533119 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" containerName="glance-httpd" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.534142 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.540853 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660442 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660478 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660514 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-logs\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660545 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-sys\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660562 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660580 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-config-data\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660599 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-run\") pod \"glance-default-internal-api-1\" (UID: 
\"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660617 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599xq\" (UniqueName: \"kubernetes.io/projected/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-kube-api-access-599xq\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660664 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-dev\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660697 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660733 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-scripts\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660748 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660768 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.660786 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.762310 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.762703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-scripts\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.762821 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.762925 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763192 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763291 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763411 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-logs\") pod 
\"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763529 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-sys\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763682 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.763844 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-config-data\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.764174 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-run\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.764311 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599xq\" (UniqueName: \"kubernetes.io/projected/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-kube-api-access-599xq\") pod \"glance-default-internal-api-1\" 
(UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.764413 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-dev\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.764603 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-dev\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.764751 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.766190 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-logs\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.766273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 
18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.766476 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.766796 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.769888 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-scripts\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.770292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-sys\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.770468 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.772816 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.772857 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-run\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.772885 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.775034 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-config-data\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.797726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599xq\" (UniqueName: \"kubernetes.io/projected/08bd38b3-2e1b-4517-b07c-4c027b71f9fc-kube-api-access-599xq\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.808534 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.814134 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"08bd38b3-2e1b-4517-b07c-4c027b71f9fc\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:47 crc kubenswrapper[4762]: I0217 18:10:47.848724 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.067958 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:10:48 crc kubenswrapper[4762]: E0217 18:10:48.069188 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:10:48 crc kubenswrapper[4762]: E0217 18:10:48.069221 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:10:48 crc kubenswrapper[4762]: E0217 18:10:48.069266 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift podName:ae866fa5-748d-4935-a3d2-2fe08bc9693f nodeName:}" failed. No retries permitted until 2026-02-17 18:12:50.069246355 +0000 UTC m=+1521.714164365 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift") pod "swift-storage-0" (UID: "ae866fa5-748d-4935-a3d2-2fe08bc9693f") : configmap "swift-ring-files" not found Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.087023 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Feb 17 18:10:48 crc kubenswrapper[4762]: E0217 18:10:48.242667 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" podUID="e576e3fe-21e1-4867-adcc-bb586e3a5921" Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.476579 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"08bd38b3-2e1b-4517-b07c-4c027b71f9fc","Type":"ContainerStarted","Data":"552c4a2bc84e55f7b56563f2877acdf5d76d056aec0f18be6cb5c0afa7c760b9"} Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.478472 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"08bd38b3-2e1b-4517-b07c-4c027b71f9fc","Type":"ContainerStarted","Data":"0b4eafe099a3dce0aef647e90def95aff59296ae2e727882ff1ddb595ea75398"} Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.478570 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"08bd38b3-2e1b-4517-b07c-4c027b71f9fc","Type":"ContainerStarted","Data":"dca5807732f3b66df6a712323140bac35a31dc714a22c9ef3bb5f03710a952a0"} Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.478854 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:10:48 crc kubenswrapper[4762]: I0217 18:10:48.496836 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=1.496775511 podStartE2EDuration="1.496775511s" podCreationTimestamp="2026-02-17 18:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:10:48.493999032 +0000 UTC m=+1400.138917042" watchObservedRunningTime="2026-02-17 18:10:48.496775511 +0000 UTC m=+1400.141693521" Feb 17 18:10:49 crc kubenswrapper[4762]: I0217 18:10:49.051082 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3176010-3aaa-4b0c-b6d2-c23e0583b7a5" path="/var/lib/kubelet/pods/d3176010-3aaa-4b0c-b6d2-c23e0583b7a5/volumes" Feb 17 18:10:50 crc kubenswrapper[4762]: I0217 18:10:50.514791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:10:50 crc kubenswrapper[4762]: E0217 18:10:50.514988 4762 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:10:50 crc kubenswrapper[4762]: E0217 18:10:50.515200 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7: configmap "swift-ring-files" not found Feb 17 18:10:50 crc kubenswrapper[4762]: E0217 18:10:50.515261 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift podName:e576e3fe-21e1-4867-adcc-bb586e3a5921 nodeName:}" failed. 
No retries permitted until 2026-02-17 18:12:52.515233724 +0000 UTC m=+1524.160151804 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift") pod "swift-proxy-5f6df75b65-p6tm7" (UID: "e576e3fe-21e1-4867-adcc-bb586e3a5921") : configmap "swift-ring-files" not found Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.020597 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.021793 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.045986 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.070890 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.142709 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.142767 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.155898 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.155954 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.170375 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.181533 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.210412 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.212101 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.539184 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.539232 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.539250 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.539271 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.539306 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:55 crc kubenswrapper[4762]: I0217 18:10:55.539321 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.548896 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:10:57 crc 
kubenswrapper[4762]: I0217 18:10:57.549237 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.548923 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.549334 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.599516 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.604123 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.650241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.650336 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.682017 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.685217 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.739577 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.783128 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.849337 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.849381 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.872385 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:57 crc kubenswrapper[4762]: I0217 18:10:57.884050 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:58 crc kubenswrapper[4762]: I0217 18:10:58.559236 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:58 crc kubenswrapper[4762]: I0217 18:10:58.561317 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:10:59 crc kubenswrapper[4762]: I0217 18:10:59.565831 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-log" containerID="cri-o://f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9" gracePeriod=30 Feb 17 18:10:59 crc kubenswrapper[4762]: I0217 18:10:59.566361 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-httpd" containerID="cri-o://6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee" gracePeriod=30 Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.573545 4762 generic.go:334] "Generic (PLEG): container finished" podID="1a02bd31-330d-41d8-be42-f78d0036a9b2" 
containerID="f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9" exitCode=143 Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.573643 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"1a02bd31-330d-41d8-be42-f78d0036a9b2","Type":"ContainerDied","Data":"f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9"} Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.573941 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.573953 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.600711 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.602465 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.658893 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.659190 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-log" containerID="cri-o://7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6" gracePeriod=30 Feb 17 18:11:00 crc kubenswrapper[4762]: I0217 18:11:00.659332 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-httpd" containerID="cri-o://4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149" gracePeriod=30 Feb 
17 18:11:01 crc kubenswrapper[4762]: I0217 18:11:01.583159 4762 generic.go:334] "Generic (PLEG): container finished" podID="13a4848c-ddb1-4915-9615-8470fac770de" containerID="7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6" exitCode=143 Feb 17 18:11:01 crc kubenswrapper[4762]: I0217 18:11:01.583965 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"13a4848c-ddb1-4915-9615-8470fac770de","Type":"ContainerDied","Data":"7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6"} Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.044183 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.107862 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6zm2\" (UniqueName: \"kubernetes.io/projected/1a02bd31-330d-41d8-be42-f78d0036a9b2-kube-api-access-z6zm2\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.107925 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-httpd-run\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.107952 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.107987 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-scripts\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108025 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-run\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108044 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-config-data\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108103 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-lib-modules\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108134 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-var-locks-brick\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108124 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-run" (OuterVolumeSpecName: "run") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108190 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108194 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-logs\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108248 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-sys\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108304 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-dev\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108328 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108303 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108333 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-sys" (OuterVolumeSpecName: "sys") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108374 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108435 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-dev" (OuterVolumeSpecName: "dev") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108350 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-nvme\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108481 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-iscsi\") pod \"1a02bd31-330d-41d8-be42-f78d0036a9b2\" (UID: \"1a02bd31-330d-41d8-be42-f78d0036a9b2\") " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108486 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-logs" (OuterVolumeSpecName: "logs") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108518 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108630 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.108988 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109007 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109019 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109032 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109042 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-sys\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109052 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109062 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109073 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/1a02bd31-330d-41d8-be42-f78d0036a9b2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.109084 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1a02bd31-330d-41d8-be42-f78d0036a9b2-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.113423 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.113672 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.115479 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-scripts" (OuterVolumeSpecName: "scripts") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.118004 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a02bd31-330d-41d8-be42-f78d0036a9b2-kube-api-access-z6zm2" (OuterVolumeSpecName: "kube-api-access-z6zm2") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). 
InnerVolumeSpecName "kube-api-access-z6zm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.143583 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-config-data" (OuterVolumeSpecName: "config-data") pod "1a02bd31-330d-41d8-be42-f78d0036a9b2" (UID: "1a02bd31-330d-41d8-be42-f78d0036a9b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.210462 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.210497 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6zm2\" (UniqueName: \"kubernetes.io/projected/1a02bd31-330d-41d8-be42-f78d0036a9b2-kube-api-access-z6zm2\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.210512 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.210520 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.210529 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a02bd31-330d-41d8-be42-f78d0036a9b2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.223892 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.224387 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.311952 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.311985 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.600746 4762 generic.go:334] "Generic (PLEG): container finished" podID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerID="6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee" exitCode=0 Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.600793 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.600791 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"1a02bd31-330d-41d8-be42-f78d0036a9b2","Type":"ContainerDied","Data":"6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee"} Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.600975 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"1a02bd31-330d-41d8-be42-f78d0036a9b2","Type":"ContainerDied","Data":"fa960f6806f14c89906601f66c3458e9f770f70e65863c8f34439cda60f8fc7c"} Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.601020 4762 scope.go:117] "RemoveContainer" containerID="6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.628087 4762 scope.go:117] "RemoveContainer" containerID="f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.630902 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.637452 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.652223 4762 scope.go:117] "RemoveContainer" containerID="6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee" Feb 17 18:11:03 crc kubenswrapper[4762]: E0217 18:11:03.652677 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee\": container with ID starting with 6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee not found: ID does not exist" 
containerID="6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.652724 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee"} err="failed to get container status \"6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee\": rpc error: code = NotFound desc = could not find container \"6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee\": container with ID starting with 6be606ee08df9baa33d54ae1f968a338cec01b2f37e48a946f99967d375328ee not found: ID does not exist" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.652754 4762 scope.go:117] "RemoveContainer" containerID="f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9" Feb 17 18:11:03 crc kubenswrapper[4762]: E0217 18:11:03.653137 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9\": container with ID starting with f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9 not found: ID does not exist" containerID="f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.653154 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9"} err="failed to get container status \"f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9\": rpc error: code = NotFound desc = could not find container \"f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9\": container with ID starting with f905608162dfd421abdbb36db7a64f6bf1793daf254e110d678a8521b1caaae9 not found: ID does not exist" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.665899 4762 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:11:03 crc kubenswrapper[4762]: E0217 18:11:03.666175 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-log" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.666191 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-log" Feb 17 18:11:03 crc kubenswrapper[4762]: E0217 18:11:03.666204 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-httpd" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.666211 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-httpd" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.666360 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-log" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.666372 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" containerName="glance-httpd" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.667119 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.679894 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717354 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717410 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4vx\" (UniqueName: \"kubernetes.io/projected/4994f27b-c494-4a5e-8867-1d3f3ee6a766-kube-api-access-8l4vx\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717445 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4994f27b-c494-4a5e-8867-1d3f3ee6a766-scripts\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717469 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717507 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4994f27b-c494-4a5e-8867-1d3f3ee6a766-config-data\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717530 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-sys\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-dev\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717574 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-run\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717639 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717690 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717720 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717756 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717775 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4994f27b-c494-4a5e-8867-1d3f3ee6a766-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.717813 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4994f27b-c494-4a5e-8867-1d3f3ee6a766-logs\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: 
I0217 18:11:03.819018 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4994f27b-c494-4a5e-8867-1d3f3ee6a766-logs\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819105 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819166 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4vx\" (UniqueName: \"kubernetes.io/projected/4994f27b-c494-4a5e-8867-1d3f3ee6a766-kube-api-access-8l4vx\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4994f27b-c494-4a5e-8867-1d3f3ee6a766-scripts\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819215 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819254 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4994f27b-c494-4a5e-8867-1d3f3ee6a766-config-data\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819280 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-sys\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-dev\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819320 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-run\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819330 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819363 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819410 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819438 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819451 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819550 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-sys\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819579 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-dev\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819610 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-run\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819614 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4994f27b-c494-4a5e-8867-1d3f3ee6a766-logs\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819694 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819718 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819470 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4994f27b-c494-4a5e-8867-1d3f3ee6a766-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.819733 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4994f27b-c494-4a5e-8867-1d3f3ee6a766-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.820051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4994f27b-c494-4a5e-8867-1d3f3ee6a766-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.822900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4994f27b-c494-4a5e-8867-1d3f3ee6a766-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.823349 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4994f27b-c494-4a5e-8867-1d3f3ee6a766-config-data\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.846125 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4vx\" (UniqueName: \"kubernetes.io/projected/4994f27b-c494-4a5e-8867-1d3f3ee6a766-kube-api-access-8l4vx\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.852187 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.854945 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4994f27b-c494-4a5e-8867-1d3f3ee6a766\") " pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:03 crc kubenswrapper[4762]: I0217 18:11:03.984484 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.154489 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226403 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-nvme\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226463 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-logs\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226513 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226518 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-iscsi\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226549 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226561 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-run\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226586 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-var-locks-brick\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226613 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-httpd-run\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226692 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226716 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-sys\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226741 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod 
\"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226762 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-dev\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226821 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-lib-modules\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226880 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-scripts\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226900 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-config-data\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226908 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-logs" (OuterVolumeSpecName: "logs") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226936 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm46n\" (UniqueName: \"kubernetes.io/projected/13a4848c-ddb1-4915-9615-8470fac770de-kube-api-access-zm46n\") pod \"13a4848c-ddb1-4915-9615-8470fac770de\" (UID: \"13a4848c-ddb1-4915-9615-8470fac770de\") " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226938 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-sys" (OuterVolumeSpecName: "sys") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226959 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-run" (OuterVolumeSpecName: "run") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.226978 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227216 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227229 4762 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227240 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227248 4762 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227256 4762 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227264 4762 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227273 4762 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-sys\") on node \"crc\" 
DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.227804 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-dev" (OuterVolumeSpecName: "dev") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.228153 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.232956 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.233014 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.233508 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-scripts" (OuterVolumeSpecName: "scripts") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.233541 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a4848c-ddb1-4915-9615-8470fac770de-kube-api-access-zm46n" (OuterVolumeSpecName: "kube-api-access-zm46n") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "kube-api-access-zm46n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.275819 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-config-data" (OuterVolumeSpecName: "config-data") pod "13a4848c-ddb1-4915-9615-8470fac770de" (UID: "13a4848c-ddb1-4915-9615-8470fac770de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328769 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328813 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a4848c-ddb1-4915-9615-8470fac770de-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328826 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm46n\" (UniqueName: \"kubernetes.io/projected/13a4848c-ddb1-4915-9615-8470fac770de-kube-api-access-zm46n\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328839 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a4848c-ddb1-4915-9615-8470fac770de-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328873 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328889 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328899 4762 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-dev\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.328908 4762 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/13a4848c-ddb1-4915-9615-8470fac770de-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.343294 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.344252 4762 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.396650 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Feb 17 18:11:04 crc kubenswrapper[4762]: W0217 18:11:04.397260 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4994f27b_c494_4a5e_8867_1d3f3ee6a766.slice/crio-95483d2be0bd1b0a9493f647ad1b0e1854ef3fe71e07a0bff1adaa6a45d6728a WatchSource:0}: Error finding container 95483d2be0bd1b0a9493f647ad1b0e1854ef3fe71e07a0bff1adaa6a45d6728a: Status 404 returned error can't find the container with id 95483d2be0bd1b0a9493f647ad1b0e1854ef3fe71e07a0bff1adaa6a45d6728a Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.430719 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.430761 4762 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.610410 4762 generic.go:334] "Generic (PLEG): container finished" podID="13a4848c-ddb1-4915-9615-8470fac770de" 
containerID="4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149" exitCode=0 Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.610471 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"13a4848c-ddb1-4915-9615-8470fac770de","Type":"ContainerDied","Data":"4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149"} Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.610912 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"13a4848c-ddb1-4915-9615-8470fac770de","Type":"ContainerDied","Data":"f63a7d61b6b27fab2b01fc2089282f9bdb523fc5360b5f05aa3403422727f3ce"} Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.610936 4762 scope.go:117] "RemoveContainer" containerID="4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.610489 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.612892 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"4994f27b-c494-4a5e-8867-1d3f3ee6a766","Type":"ContainerStarted","Data":"b2d11a0c9b57d6b48b4d949eaec52dee9a6844484fabaa22fe674f4474f54154"} Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.613049 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"4994f27b-c494-4a5e-8867-1d3f3ee6a766","Type":"ContainerStarted","Data":"95483d2be0bd1b0a9493f647ad1b0e1854ef3fe71e07a0bff1adaa6a45d6728a"} Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.632782 4762 scope.go:117] "RemoveContainer" containerID="7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.656879 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.657440 4762 scope.go:117] "RemoveContainer" containerID="4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149" Feb 17 18:11:04 crc kubenswrapper[4762]: E0217 18:11:04.658115 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149\": container with ID starting with 4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149 not found: ID does not exist" containerID="4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.658258 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149"} err="failed to get container status 
\"4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149\": rpc error: code = NotFound desc = could not find container \"4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149\": container with ID starting with 4f63397d02da2527dafbcd6f97d5ade6e002e782e6e5809ae3de30b140f81149 not found: ID does not exist" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.658370 4762 scope.go:117] "RemoveContainer" containerID="7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6" Feb 17 18:11:04 crc kubenswrapper[4762]: E0217 18:11:04.659199 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6\": container with ID starting with 7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6 not found: ID does not exist" containerID="7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.659278 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6"} err="failed to get container status \"7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6\": rpc error: code = NotFound desc = could not find container \"7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6\": container with ID starting with 7d926116ed316e7df2c8d6f90dbff2e3afcf6fd9daa4f659ef752857ccbfc3f6 not found: ID does not exist" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.667905 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.684701 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:11:04 crc kubenswrapper[4762]: E0217 18:11:04.685027 4762 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-httpd" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.685044 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-httpd" Feb 17 18:11:04 crc kubenswrapper[4762]: E0217 18:11:04.685062 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-log" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.685069 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-log" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.685216 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-log" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.685226 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a4848c-ddb1-4915-9615-8470fac770de" containerName="glance-httpd" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.686010 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.693508 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736261 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-sys\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736343 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6020f61b-1c5c-4266-941c-6b18ce30c5c7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736437 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736467 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshnm\" (UniqueName: \"kubernetes.io/projected/6020f61b-1c5c-4266-941c-6b18ce30c5c7-kube-api-access-vshnm\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736515 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736592 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-run\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736707 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.736916 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6020f61b-1c5c-4266-941c-6b18ce30c5c7-logs\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.737016 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6020f61b-1c5c-4266-941c-6b18ce30c5c7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.737167 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6020f61b-1c5c-4266-941c-6b18ce30c5c7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.737208 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-dev\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.737237 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.737288 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc 
kubenswrapper[4762]: I0217 18:11:04.838407 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838851 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-sys\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838892 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6020f61b-1c5c-4266-941c-6b18ce30c5c7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838927 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838950 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vshnm\" (UniqueName: \"kubernetes.io/projected/6020f61b-1c5c-4266-941c-6b18ce30c5c7-kube-api-access-vshnm\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838972 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-run\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839086 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6020f61b-1c5c-4266-941c-6b18ce30c5c7-logs\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839131 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6020f61b-1c5c-4266-941c-6b18ce30c5c7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6020f61b-1c5c-4266-941c-6b18ce30c5c7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839227 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-dev\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839314 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-dev\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.838606 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839392 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.839425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-sys\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.840232 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.840459 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.841439 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.841515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.841980 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6020f61b-1c5c-4266-941c-6b18ce30c5c7-logs\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.842029 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6020f61b-1c5c-4266-941c-6b18ce30c5c7-run\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.842263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6020f61b-1c5c-4266-941c-6b18ce30c5c7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.847595 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6020f61b-1c5c-4266-941c-6b18ce30c5c7-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.857756 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6020f61b-1c5c-4266-941c-6b18ce30c5c7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.861321 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshnm\" (UniqueName: \"kubernetes.io/projected/6020f61b-1c5c-4266-941c-6b18ce30c5c7-kube-api-access-vshnm\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.861369 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:04 crc kubenswrapper[4762]: I0217 18:11:04.866481 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"6020f61b-1c5c-4266-941c-6b18ce30c5c7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.002202 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.044530 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a4848c-ddb1-4915-9615-8470fac770de" path="/var/lib/kubelet/pods/13a4848c-ddb1-4915-9615-8470fac770de/volumes" Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.045332 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a02bd31-330d-41d8-be42-f78d0036a9b2" path="/var/lib/kubelet/pods/1a02bd31-330d-41d8-be42-f78d0036a9b2/volumes" Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.419979 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Feb 17 18:11:05 crc kubenswrapper[4762]: W0217 18:11:05.421492 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6020f61b_1c5c_4266_941c_6b18ce30c5c7.slice/crio-03a5965767f9acace2fbd2dc6ee60b78c4d9f1a9982d81de5a88feeabe8d97ba WatchSource:0}: Error finding container 03a5965767f9acace2fbd2dc6ee60b78c4d9f1a9982d81de5a88feeabe8d97ba: Status 404 returned error can't find the container with id 03a5965767f9acace2fbd2dc6ee60b78c4d9f1a9982d81de5a88feeabe8d97ba Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.624456 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6020f61b-1c5c-4266-941c-6b18ce30c5c7","Type":"ContainerStarted","Data":"5606e6650c84ee2bdccb9e389421db111c3f4bf616c9560c1310d25140e22eb5"} Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.624865 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6020f61b-1c5c-4266-941c-6b18ce30c5c7","Type":"ContainerStarted","Data":"03a5965767f9acace2fbd2dc6ee60b78c4d9f1a9982d81de5a88feeabe8d97ba"} Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 
18:11:05.628899 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"4994f27b-c494-4a5e-8867-1d3f3ee6a766","Type":"ContainerStarted","Data":"b44924b2a2141fbe865551a1f3109cce57fb210058d51d4acc164f667cae2eea"} Feb 17 18:11:05 crc kubenswrapper[4762]: I0217 18:11:05.664383 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.66436138 podStartE2EDuration="2.66436138s" podCreationTimestamp="2026-02-17 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:11:05.656206488 +0000 UTC m=+1417.301124498" watchObservedRunningTime="2026-02-17 18:11:05.66436138 +0000 UTC m=+1417.309279390" Feb 17 18:11:06 crc kubenswrapper[4762]: I0217 18:11:06.636004 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6020f61b-1c5c-4266-941c-6b18ce30c5c7","Type":"ContainerStarted","Data":"782663d1edfc6804ae8a6ea1b56b44f2f941d5e4185fb6745751680221cb4865"} Feb 17 18:11:06 crc kubenswrapper[4762]: I0217 18:11:06.656830 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.656800468 podStartE2EDuration="2.656800468s" podCreationTimestamp="2026-02-17 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:11:06.654382089 +0000 UTC m=+1418.299300109" watchObservedRunningTime="2026-02-17 18:11:06.656800468 +0000 UTC m=+1418.301718478" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.027089 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4jbz5"] Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.028720 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.054828 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jbz5"] Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.116612 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-catalog-content\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.116680 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-utilities\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.116712 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z5p\" (UniqueName: \"kubernetes.io/projected/30e5040e-3923-4a8b-afe3-d7f08222c64f-kube-api-access-m4z5p\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.219823 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-catalog-content\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.219872 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-utilities\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.219900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z5p\" (UniqueName: \"kubernetes.io/projected/30e5040e-3923-4a8b-afe3-d7f08222c64f-kube-api-access-m4z5p\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.220352 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-utilities\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.220515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-catalog-content\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.257084 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z5p\" (UniqueName: \"kubernetes.io/projected/30e5040e-3923-4a8b-afe3-d7f08222c64f-kube-api-access-m4z5p\") pod \"redhat-operators-4jbz5\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.366772 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:09 crc kubenswrapper[4762]: I0217 18:11:09.829065 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jbz5"] Feb 17 18:11:09 crc kubenswrapper[4762]: W0217 18:11:09.836184 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30e5040e_3923_4a8b_afe3_d7f08222c64f.slice/crio-449d235bebbf9df1f3d800bd8c234ab3f416b77fe130b47ab3d32f23cfbf7d84 WatchSource:0}: Error finding container 449d235bebbf9df1f3d800bd8c234ab3f416b77fe130b47ab3d32f23cfbf7d84: Status 404 returned error can't find the container with id 449d235bebbf9df1f3d800bd8c234ab3f416b77fe130b47ab3d32f23cfbf7d84 Feb 17 18:11:10 crc kubenswrapper[4762]: I0217 18:11:10.663658 4762 generic.go:334] "Generic (PLEG): container finished" podID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerID="35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13" exitCode=0 Feb 17 18:11:10 crc kubenswrapper[4762]: I0217 18:11:10.663944 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerDied","Data":"35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13"} Feb 17 18:11:10 crc kubenswrapper[4762]: I0217 18:11:10.663971 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerStarted","Data":"449d235bebbf9df1f3d800bd8c234ab3f416b77fe130b47ab3d32f23cfbf7d84"} Feb 17 18:11:10 crc kubenswrapper[4762]: I0217 18:11:10.665521 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:11:11 crc kubenswrapper[4762]: I0217 18:11:11.673405 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerStarted","Data":"3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5"} Feb 17 18:11:12 crc kubenswrapper[4762]: I0217 18:11:12.682278 4762 generic.go:334] "Generic (PLEG): container finished" podID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerID="3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5" exitCode=0 Feb 17 18:11:12 crc kubenswrapper[4762]: I0217 18:11:12.682315 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerDied","Data":"3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5"} Feb 17 18:11:13 crc kubenswrapper[4762]: I0217 18:11:13.700892 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerStarted","Data":"11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489"} Feb 17 18:11:13 crc kubenswrapper[4762]: I0217 18:11:13.724644 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4jbz5" podStartSLOduration=2.326043232 podStartE2EDuration="4.724611671s" podCreationTimestamp="2026-02-17 18:11:09 +0000 UTC" firstStartedPulling="2026-02-17 18:11:10.665327755 +0000 UTC m=+1422.310245765" lastFinishedPulling="2026-02-17 18:11:13.063896204 +0000 UTC m=+1424.708814204" observedRunningTime="2026-02-17 18:11:13.719079664 +0000 UTC m=+1425.363997674" watchObservedRunningTime="2026-02-17 18:11:13.724611671 +0000 UTC m=+1425.369529681" Feb 17 18:11:13 crc kubenswrapper[4762]: I0217 18:11:13.985348 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:13 crc kubenswrapper[4762]: I0217 18:11:13.985399 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:14 crc kubenswrapper[4762]: I0217 18:11:14.017640 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:14 crc kubenswrapper[4762]: I0217 18:11:14.030786 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:14 crc kubenswrapper[4762]: I0217 18:11:14.708731 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:14 crc kubenswrapper[4762]: I0217 18:11:14.709137 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:15 crc kubenswrapper[4762]: I0217 18:11:15.002368 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:15 crc kubenswrapper[4762]: I0217 18:11:15.002426 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:15 crc kubenswrapper[4762]: I0217 18:11:15.027085 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:15 crc kubenswrapper[4762]: I0217 18:11:15.048561 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:15 crc kubenswrapper[4762]: I0217 18:11:15.716245 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:15 crc kubenswrapper[4762]: I0217 18:11:15.716284 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:16 crc kubenswrapper[4762]: I0217 18:11:16.635730 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:16 crc kubenswrapper[4762]: I0217 18:11:16.720562 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:11:16 crc kubenswrapper[4762]: I0217 18:11:16.823100 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Feb 17 18:11:17 crc kubenswrapper[4762]: I0217 18:11:17.730000 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:11:17 crc kubenswrapper[4762]: I0217 18:11:17.730060 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:11:17 crc kubenswrapper[4762]: I0217 18:11:17.806204 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:17 crc kubenswrapper[4762]: I0217 18:11:17.977277 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Feb 17 18:11:19 crc kubenswrapper[4762]: I0217 18:11:19.367563 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:19 crc kubenswrapper[4762]: I0217 18:11:19.367949 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:19 crc kubenswrapper[4762]: I0217 18:11:19.417546 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:19 crc kubenswrapper[4762]: I0217 18:11:19.791127 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4jbz5" 
Feb 17 18:11:19 crc kubenswrapper[4762]: I0217 18:11:19.834027 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jbz5"] Feb 17 18:11:21 crc kubenswrapper[4762]: I0217 18:11:21.757274 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4jbz5" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="registry-server" containerID="cri-o://11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489" gracePeriod=2 Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.109645 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.168331 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-utilities\") pod \"30e5040e-3923-4a8b-afe3-d7f08222c64f\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.168376 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-catalog-content\") pod \"30e5040e-3923-4a8b-afe3-d7f08222c64f\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.168426 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4z5p\" (UniqueName: \"kubernetes.io/projected/30e5040e-3923-4a8b-afe3-d7f08222c64f-kube-api-access-m4z5p\") pod \"30e5040e-3923-4a8b-afe3-d7f08222c64f\" (UID: \"30e5040e-3923-4a8b-afe3-d7f08222c64f\") " Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.174543 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-utilities" (OuterVolumeSpecName: "utilities") pod "30e5040e-3923-4a8b-afe3-d7f08222c64f" (UID: "30e5040e-3923-4a8b-afe3-d7f08222c64f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.184274 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e5040e-3923-4a8b-afe3-d7f08222c64f-kube-api-access-m4z5p" (OuterVolumeSpecName: "kube-api-access-m4z5p") pod "30e5040e-3923-4a8b-afe3-d7f08222c64f" (UID: "30e5040e-3923-4a8b-afe3-d7f08222c64f"). InnerVolumeSpecName "kube-api-access-m4z5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.269908 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.269936 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4z5p\" (UniqueName: \"kubernetes.io/projected/30e5040e-3923-4a8b-afe3-d7f08222c64f-kube-api-access-m4z5p\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.768448 4762 generic.go:334] "Generic (PLEG): container finished" podID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerID="11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489" exitCode=0 Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.768495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerDied","Data":"11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489"} Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.768536 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4jbz5" event={"ID":"30e5040e-3923-4a8b-afe3-d7f08222c64f","Type":"ContainerDied","Data":"449d235bebbf9df1f3d800bd8c234ab3f416b77fe130b47ab3d32f23cfbf7d84"} Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.768556 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jbz5" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.768563 4762 scope.go:117] "RemoveContainer" containerID="11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.787461 4762 scope.go:117] "RemoveContainer" containerID="3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.808879 4762 scope.go:117] "RemoveContainer" containerID="35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.834023 4762 scope.go:117] "RemoveContainer" containerID="11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489" Feb 17 18:11:22 crc kubenswrapper[4762]: E0217 18:11:22.834538 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489\": container with ID starting with 11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489 not found: ID does not exist" containerID="11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.834583 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489"} err="failed to get container status \"11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489\": rpc error: code = NotFound desc = could not find container 
\"11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489\": container with ID starting with 11e83146e8e4de3f01f3b8d438842b5000869bc18b1389e3919d1e6221c2d489 not found: ID does not exist" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.834609 4762 scope.go:117] "RemoveContainer" containerID="3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5" Feb 17 18:11:22 crc kubenswrapper[4762]: E0217 18:11:22.835160 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5\": container with ID starting with 3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5 not found: ID does not exist" containerID="3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.835218 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5"} err="failed to get container status \"3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5\": rpc error: code = NotFound desc = could not find container \"3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5\": container with ID starting with 3447d3b4339394ae7612984a77e3fbf70f4c831be2ea082274cc09f0eaed12b5 not found: ID does not exist" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.835245 4762 scope.go:117] "RemoveContainer" containerID="35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13" Feb 17 18:11:22 crc kubenswrapper[4762]: E0217 18:11:22.835805 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13\": container with ID starting with 35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13 not found: ID does not exist" 
containerID="35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13" Feb 17 18:11:22 crc kubenswrapper[4762]: I0217 18:11:22.835835 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13"} err="failed to get container status \"35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13\": rpc error: code = NotFound desc = could not find container \"35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13\": container with ID starting with 35156e3bc440c1c1a68e135ffe1648196192393080c00a9e624a449763aadb13 not found: ID does not exist" Feb 17 18:11:24 crc kubenswrapper[4762]: I0217 18:11:24.609293 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30e5040e-3923-4a8b-afe3-d7f08222c64f" (UID: "30e5040e-3923-4a8b-afe3-d7f08222c64f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:11:24 crc kubenswrapper[4762]: I0217 18:11:24.612228 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e5040e-3923-4a8b-afe3-d7f08222c64f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:11:24 crc kubenswrapper[4762]: I0217 18:11:24.921384 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jbz5"] Feb 17 18:11:24 crc kubenswrapper[4762]: I0217 18:11:24.929345 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4jbz5"] Feb 17 18:11:25 crc kubenswrapper[4762]: I0217 18:11:25.044857 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" path="/var/lib/kubelet/pods/30e5040e-3923-4a8b-afe3-d7f08222c64f/volumes" Feb 17 18:11:30 crc kubenswrapper[4762]: I0217 18:11:30.055610 4762 scope.go:117] "RemoveContainer" containerID="51ec8b176ceb5176438c0db7debb8f393fbcdcd2f3b380a83f2788441c519854" Feb 17 18:11:30 crc kubenswrapper[4762]: I0217 18:11:30.076769 4762 scope.go:117] "RemoveContainer" containerID="021a4ca038c82f7b4a3651311de17a17a2a68d0d2baf7f4c92b48abcdac9185a" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.247657 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-drbf5"] Feb 17 18:12:03 crc kubenswrapper[4762]: E0217 18:12:03.249790 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="registry-server" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.249878 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="registry-server" Feb 17 18:12:03 crc kubenswrapper[4762]: E0217 18:12:03.249969 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="extract-utilities" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.250039 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="extract-utilities" Feb 17 18:12:03 crc kubenswrapper[4762]: E0217 18:12:03.250097 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="extract-content" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.250151 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="extract-content" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.250345 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e5040e-3923-4a8b-afe3-d7f08222c64f" containerName="registry-server" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.250980 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.252793 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.253205 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.256916 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-drbf5"] Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.323129 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhgpf\" (UniqueName: \"kubernetes.io/projected/f94a7cb6-015a-4a94-8a90-b34d2790a272-kube-api-access-qhgpf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.323185 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-ring-data-devices\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.323207 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-scripts\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.323225 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-swiftconf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.323330 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f94a7cb6-015a-4a94-8a90-b34d2790a272-etc-swift\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.323420 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-dispersionconf\") pod \"swift-ring-rebalance-drbf5\" (UID: 
\"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.425105 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhgpf\" (UniqueName: \"kubernetes.io/projected/f94a7cb6-015a-4a94-8a90-b34d2790a272-kube-api-access-qhgpf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.425298 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-ring-data-devices\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.425330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-scripts\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.425355 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-swiftconf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.425386 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f94a7cb6-015a-4a94-8a90-b34d2790a272-etc-swift\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.425421 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-dispersionconf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.426098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f94a7cb6-015a-4a94-8a90-b34d2790a272-etc-swift\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.426327 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-scripts\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.427380 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-ring-data-devices\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.438263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-dispersionconf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc 
kubenswrapper[4762]: I0217 18:12:03.438300 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-swiftconf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.440543 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhgpf\" (UniqueName: \"kubernetes.io/projected/f94a7cb6-015a-4a94-8a90-b34d2790a272-kube-api-access-qhgpf\") pod \"swift-ring-rebalance-drbf5\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.570520 4762 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-xxl98" Feb 17 18:12:03 crc kubenswrapper[4762]: I0217 18:12:03.578757 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:04 crc kubenswrapper[4762]: I0217 18:12:04.005897 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-drbf5"] Feb 17 18:12:04 crc kubenswrapper[4762]: I0217 18:12:04.101542 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" event={"ID":"f94a7cb6-015a-4a94-8a90-b34d2790a272","Type":"ContainerStarted","Data":"71e5ab4e9fb0e61c93441889e84a88a32a58b8a72a848bda3bf0c2e4a57eacc2"} Feb 17 18:12:04 crc kubenswrapper[4762]: I0217 18:12:04.558145 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:12:04 crc kubenswrapper[4762]: I0217 18:12:04.558214 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.538592 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jklxw"] Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.541797 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.563514 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jklxw"] Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.578703 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-utilities\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.578774 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-catalog-content\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.578813 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shxgk\" (UniqueName: \"kubernetes.io/projected/659d6a48-eb8d-43aa-aed3-7f010e5641da-kube-api-access-shxgk\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.682388 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-utilities\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.682434 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-catalog-content\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.682460 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shxgk\" (UniqueName: \"kubernetes.io/projected/659d6a48-eb8d-43aa-aed3-7f010e5641da-kube-api-access-shxgk\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.683141 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-utilities\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.683252 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-catalog-content\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.717683 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shxgk\" (UniqueName: \"kubernetes.io/projected/659d6a48-eb8d-43aa-aed3-7f010e5641da-kube-api-access-shxgk\") pod \"redhat-marketplace-jklxw\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:09 crc kubenswrapper[4762]: I0217 18:12:09.868918 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:10 crc kubenswrapper[4762]: I0217 18:12:10.482766 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jklxw"] Feb 17 18:12:11 crc kubenswrapper[4762]: I0217 18:12:11.170381 4762 generic.go:334] "Generic (PLEG): container finished" podID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerID="e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde" exitCode=0 Feb 17 18:12:11 crc kubenswrapper[4762]: I0217 18:12:11.170726 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jklxw" event={"ID":"659d6a48-eb8d-43aa-aed3-7f010e5641da","Type":"ContainerDied","Data":"e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde"} Feb 17 18:12:11 crc kubenswrapper[4762]: I0217 18:12:11.170752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jklxw" event={"ID":"659d6a48-eb8d-43aa-aed3-7f010e5641da","Type":"ContainerStarted","Data":"83f01e39c9523cc88c8d15d70fe96140c7a2dbb8a3685efc9f4ef068b456efbe"} Feb 17 18:12:11 crc kubenswrapper[4762]: I0217 18:12:11.177123 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" event={"ID":"f94a7cb6-015a-4a94-8a90-b34d2790a272","Type":"ContainerStarted","Data":"741d06c037e83909fbe0f2591a68cadc450967bd79471b66cf47f9c5da902e9c"} Feb 17 18:12:11 crc kubenswrapper[4762]: I0217 18:12:11.207244 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" podStartSLOduration=2.162013827 podStartE2EDuration="8.207227825s" podCreationTimestamp="2026-02-17 18:12:03 +0000 UTC" firstStartedPulling="2026-02-17 18:12:04.01909711 +0000 UTC m=+1475.664015120" lastFinishedPulling="2026-02-17 18:12:10.064311108 +0000 UTC m=+1481.709229118" observedRunningTime="2026-02-17 18:12:11.204367273 +0000 UTC 
m=+1482.849285283" watchObservedRunningTime="2026-02-17 18:12:11.207227825 +0000 UTC m=+1482.852145835" Feb 17 18:12:13 crc kubenswrapper[4762]: I0217 18:12:13.049086 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-gtlzz"] Feb 17 18:12:13 crc kubenswrapper[4762]: I0217 18:12:13.054577 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-gtlzz"] Feb 17 18:12:13 crc kubenswrapper[4762]: I0217 18:12:13.199799 4762 generic.go:334] "Generic (PLEG): container finished" podID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerID="beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e" exitCode=0 Feb 17 18:12:13 crc kubenswrapper[4762]: I0217 18:12:13.199865 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jklxw" event={"ID":"659d6a48-eb8d-43aa-aed3-7f010e5641da","Type":"ContainerDied","Data":"beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e"} Feb 17 18:12:14 crc kubenswrapper[4762]: I0217 18:12:14.210194 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jklxw" event={"ID":"659d6a48-eb8d-43aa-aed3-7f010e5641da","Type":"ContainerStarted","Data":"f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240"} Feb 17 18:12:15 crc kubenswrapper[4762]: I0217 18:12:15.044294 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="885f2c17-dddb-4f85-90bf-90ba0e38255a" path="/var/lib/kubelet/pods/885f2c17-dddb-4f85-90bf-90ba0e38255a/volumes" Feb 17 18:12:15 crc kubenswrapper[4762]: I0217 18:12:15.234370 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jklxw" podStartSLOduration=3.448651645 podStartE2EDuration="6.234348802s" podCreationTimestamp="2026-02-17 18:12:09 +0000 UTC" firstStartedPulling="2026-02-17 18:12:11.174024551 +0000 UTC m=+1482.818942561" 
lastFinishedPulling="2026-02-17 18:12:13.959721708 +0000 UTC m=+1485.604639718" observedRunningTime="2026-02-17 18:12:15.230289396 +0000 UTC m=+1486.875207416" watchObservedRunningTime="2026-02-17 18:12:15.234348802 +0000 UTC m=+1486.879266812" Feb 17 18:12:18 crc kubenswrapper[4762]: I0217 18:12:18.261716 4762 generic.go:334] "Generic (PLEG): container finished" podID="f94a7cb6-015a-4a94-8a90-b34d2790a272" containerID="741d06c037e83909fbe0f2591a68cadc450967bd79471b66cf47f9c5da902e9c" exitCode=0 Feb 17 18:12:18 crc kubenswrapper[4762]: I0217 18:12:18.261948 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" event={"ID":"f94a7cb6-015a-4a94-8a90-b34d2790a272","Type":"ContainerDied","Data":"741d06c037e83909fbe0f2591a68cadc450967bd79471b66cf47f9c5da902e9c"} Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.506218 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.683315 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f94a7cb6-015a-4a94-8a90-b34d2790a272-etc-swift\") pod \"f94a7cb6-015a-4a94-8a90-b34d2790a272\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.683376 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-ring-data-devices\") pod \"f94a7cb6-015a-4a94-8a90-b34d2790a272\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.683500 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-swiftconf\") pod 
\"f94a7cb6-015a-4a94-8a90-b34d2790a272\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.683552 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-dispersionconf\") pod \"f94a7cb6-015a-4a94-8a90-b34d2790a272\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.683585 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-scripts\") pod \"f94a7cb6-015a-4a94-8a90-b34d2790a272\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.683610 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhgpf\" (UniqueName: \"kubernetes.io/projected/f94a7cb6-015a-4a94-8a90-b34d2790a272-kube-api-access-qhgpf\") pod \"f94a7cb6-015a-4a94-8a90-b34d2790a272\" (UID: \"f94a7cb6-015a-4a94-8a90-b34d2790a272\") " Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.684768 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94a7cb6-015a-4a94-8a90-b34d2790a272-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f94a7cb6-015a-4a94-8a90-b34d2790a272" (UID: "f94a7cb6-015a-4a94-8a90-b34d2790a272"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.685329 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f94a7cb6-015a-4a94-8a90-b34d2790a272" (UID: "f94a7cb6-015a-4a94-8a90-b34d2790a272"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.699870 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94a7cb6-015a-4a94-8a90-b34d2790a272-kube-api-access-qhgpf" (OuterVolumeSpecName: "kube-api-access-qhgpf") pod "f94a7cb6-015a-4a94-8a90-b34d2790a272" (UID: "f94a7cb6-015a-4a94-8a90-b34d2790a272"). InnerVolumeSpecName "kube-api-access-qhgpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.708012 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f94a7cb6-015a-4a94-8a90-b34d2790a272" (UID: "f94a7cb6-015a-4a94-8a90-b34d2790a272"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.708174 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-scripts" (OuterVolumeSpecName: "scripts") pod "f94a7cb6-015a-4a94-8a90-b34d2790a272" (UID: "f94a7cb6-015a-4a94-8a90-b34d2790a272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.709133 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f94a7cb6-015a-4a94-8a90-b34d2790a272" (UID: "f94a7cb6-015a-4a94-8a90-b34d2790a272"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.785682 4762 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.785724 4762 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f94a7cb6-015a-4a94-8a90-b34d2790a272-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.785734 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.785743 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhgpf\" (UniqueName: \"kubernetes.io/projected/f94a7cb6-015a-4a94-8a90-b34d2790a272-kube-api-access-qhgpf\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.785752 4762 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f94a7cb6-015a-4a94-8a90-b34d2790a272-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.785761 4762 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f94a7cb6-015a-4a94-8a90-b34d2790a272-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.869350 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.869400 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:19 crc kubenswrapper[4762]: I0217 18:12:19.918884 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:20 crc kubenswrapper[4762]: I0217 18:12:20.277447 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" event={"ID":"f94a7cb6-015a-4a94-8a90-b34d2790a272","Type":"ContainerDied","Data":"71e5ab4e9fb0e61c93441889e84a88a32a58b8a72a848bda3bf0c2e4a57eacc2"} Feb 17 18:12:20 crc kubenswrapper[4762]: I0217 18:12:20.277479 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-drbf5" Feb 17 18:12:20 crc kubenswrapper[4762]: I0217 18:12:20.277495 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71e5ab4e9fb0e61c93441889e84a88a32a58b8a72a848bda3bf0c2e4a57eacc2" Feb 17 18:12:20 crc kubenswrapper[4762]: I0217 18:12:20.335536 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:20 crc kubenswrapper[4762]: I0217 18:12:20.384219 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jklxw"] Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.290404 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jklxw" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="registry-server" containerID="cri-o://f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240" gracePeriod=2 Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.649276 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.833899 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-utilities\") pod \"659d6a48-eb8d-43aa-aed3-7f010e5641da\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.833969 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shxgk\" (UniqueName: \"kubernetes.io/projected/659d6a48-eb8d-43aa-aed3-7f010e5641da-kube-api-access-shxgk\") pod \"659d6a48-eb8d-43aa-aed3-7f010e5641da\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.833997 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-catalog-content\") pod \"659d6a48-eb8d-43aa-aed3-7f010e5641da\" (UID: \"659d6a48-eb8d-43aa-aed3-7f010e5641da\") " Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.835020 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-utilities" (OuterVolumeSpecName: "utilities") pod "659d6a48-eb8d-43aa-aed3-7f010e5641da" (UID: "659d6a48-eb8d-43aa-aed3-7f010e5641da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.844264 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659d6a48-eb8d-43aa-aed3-7f010e5641da-kube-api-access-shxgk" (OuterVolumeSpecName: "kube-api-access-shxgk") pod "659d6a48-eb8d-43aa-aed3-7f010e5641da" (UID: "659d6a48-eb8d-43aa-aed3-7f010e5641da"). InnerVolumeSpecName "kube-api-access-shxgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.860525 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "659d6a48-eb8d-43aa-aed3-7f010e5641da" (UID: "659d6a48-eb8d-43aa-aed3-7f010e5641da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.935792 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.935827 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shxgk\" (UniqueName: \"kubernetes.io/projected/659d6a48-eb8d-43aa-aed3-7f010e5641da-kube-api-access-shxgk\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:22 crc kubenswrapper[4762]: I0217 18:12:22.935839 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659d6a48-eb8d-43aa-aed3-7f010e5641da-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.300129 4762 generic.go:334] "Generic (PLEG): container finished" podID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerID="f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240" exitCode=0 Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.300940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jklxw" event={"ID":"659d6a48-eb8d-43aa-aed3-7f010e5641da","Type":"ContainerDied","Data":"f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240"} Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.301023 4762 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jklxw" event={"ID":"659d6a48-eb8d-43aa-aed3-7f010e5641da","Type":"ContainerDied","Data":"83f01e39c9523cc88c8d15d70fe96140c7a2dbb8a3685efc9f4ef068b456efbe"} Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.301096 4762 scope.go:117] "RemoveContainer" containerID="f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.301272 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jklxw" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.330998 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jklxw"] Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.332124 4762 scope.go:117] "RemoveContainer" containerID="beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.339218 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jklxw"] Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.352048 4762 scope.go:117] "RemoveContainer" containerID="e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.384779 4762 scope.go:117] "RemoveContainer" containerID="f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240" Feb 17 18:12:23 crc kubenswrapper[4762]: E0217 18:12:23.385285 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240\": container with ID starting with f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240 not found: ID does not exist" containerID="f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.385330 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240"} err="failed to get container status \"f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240\": rpc error: code = NotFound desc = could not find container \"f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240\": container with ID starting with f26986c32efd9ed4735335b20050e908bc9240b8db11f64aad29275ef3c0d240 not found: ID does not exist" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.385350 4762 scope.go:117] "RemoveContainer" containerID="beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e" Feb 17 18:12:23 crc kubenswrapper[4762]: E0217 18:12:23.385822 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e\": container with ID starting with beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e not found: ID does not exist" containerID="beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.385862 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e"} err="failed to get container status \"beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e\": rpc error: code = NotFound desc = could not find container \"beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e\": container with ID starting with beca56e75c776d1c3aa82dd0f6679d32b66f737b347cfe02a71871ede0538e1e not found: ID does not exist" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.385888 4762 scope.go:117] "RemoveContainer" containerID="e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde" Feb 17 18:12:23 crc kubenswrapper[4762]: E0217 
18:12:23.387249 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde\": container with ID starting with e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde not found: ID does not exist" containerID="e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde" Feb 17 18:12:23 crc kubenswrapper[4762]: I0217 18:12:23.387356 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde"} err="failed to get container status \"e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde\": rpc error: code = NotFound desc = could not find container \"e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde\": container with ID starting with e40f5f4c4a242a6d887cc05a763044f40c19b46dd8cf3c6816a66ca919655dde not found: ID does not exist" Feb 17 18:12:25 crc kubenswrapper[4762]: I0217 18:12:25.066167 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" path="/var/lib/kubelet/pods/659d6a48-eb8d-43aa-aed3-7f010e5641da/volumes" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.000584 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g79x5/must-gather-xs4ng"] Feb 17 18:12:28 crc kubenswrapper[4762]: E0217 18:12:28.001227 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="extract-utilities" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.001242 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="extract-utilities" Feb 17 18:12:28 crc kubenswrapper[4762]: E0217 18:12:28.001258 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="extract-content" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.001266 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="extract-content" Feb 17 18:12:28 crc kubenswrapper[4762]: E0217 18:12:28.001282 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="registry-server" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.001290 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="registry-server" Feb 17 18:12:28 crc kubenswrapper[4762]: E0217 18:12:28.001304 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94a7cb6-015a-4a94-8a90-b34d2790a272" containerName="swift-ring-rebalance" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.001312 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94a7cb6-015a-4a94-8a90-b34d2790a272" containerName="swift-ring-rebalance" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.001465 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="659d6a48-eb8d-43aa-aed3-7f010e5641da" containerName="registry-server" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.001496 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94a7cb6-015a-4a94-8a90-b34d2790a272" containerName="swift-ring-rebalance" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.002363 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.004350 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g79x5"/"kube-root-ca.crt" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.007331 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g79x5"/"openshift-service-ca.crt" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.011990 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g79x5/must-gather-xs4ng"] Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.019583 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41630b6a-bae3-4e2b-bd82-ad7c75056f70-must-gather-output\") pod \"must-gather-xs4ng\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.019720 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnnqz\" (UniqueName: \"kubernetes.io/projected/41630b6a-bae3-4e2b-bd82-ad7c75056f70-kube-api-access-gnnqz\") pod \"must-gather-xs4ng\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.121780 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41630b6a-bae3-4e2b-bd82-ad7c75056f70-must-gather-output\") pod \"must-gather-xs4ng\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.121870 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gnnqz\" (UniqueName: \"kubernetes.io/projected/41630b6a-bae3-4e2b-bd82-ad7c75056f70-kube-api-access-gnnqz\") pod \"must-gather-xs4ng\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.122368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41630b6a-bae3-4e2b-bd82-ad7c75056f70-must-gather-output\") pod \"must-gather-xs4ng\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.149665 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnqz\" (UniqueName: \"kubernetes.io/projected/41630b6a-bae3-4e2b-bd82-ad7c75056f70-kube-api-access-gnnqz\") pod \"must-gather-xs4ng\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.319469 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:12:28 crc kubenswrapper[4762]: I0217 18:12:28.721166 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g79x5/must-gather-xs4ng"] Feb 17 18:12:29 crc kubenswrapper[4762]: I0217 18:12:29.351714 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g79x5/must-gather-xs4ng" event={"ID":"41630b6a-bae3-4e2b-bd82-ad7c75056f70","Type":"ContainerStarted","Data":"29d74ac07cfc895f35dd340ad9ae44f8e725040944cd7a4b2c3f29a9cfff6469"} Feb 17 18:12:30 crc kubenswrapper[4762]: I0217 18:12:30.190788 4762 scope.go:117] "RemoveContainer" containerID="fd134a566ddf830408d2235b5505933ed2c746d686d40322303111272f2ca1b1" Feb 17 18:12:30 crc kubenswrapper[4762]: I0217 18:12:30.219866 4762 scope.go:117] "RemoveContainer" containerID="9386635a8c4128ae781d33d8791c9d6ee8abef8b0c793ea20872f6424757af25" Feb 17 18:12:34 crc kubenswrapper[4762]: I0217 18:12:34.557992 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:12:34 crc kubenswrapper[4762]: I0217 18:12:34.558536 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:12:35 crc kubenswrapper[4762]: I0217 18:12:35.398749 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g79x5/must-gather-xs4ng" event={"ID":"41630b6a-bae3-4e2b-bd82-ad7c75056f70","Type":"ContainerStarted","Data":"58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae"} 
Feb 17 18:12:35 crc kubenswrapper[4762]: I0217 18:12:35.399259 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g79x5/must-gather-xs4ng" event={"ID":"41630b6a-bae3-4e2b-bd82-ad7c75056f70","Type":"ContainerStarted","Data":"7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013"} Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.398214 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g79x5/must-gather-xs4ng" podStartSLOduration=13.275378912 podStartE2EDuration="19.398197985s" podCreationTimestamp="2026-02-17 18:12:27 +0000 UTC" firstStartedPulling="2026-02-17 18:12:28.72753463 +0000 UTC m=+1500.372452640" lastFinishedPulling="2026-02-17 18:12:34.850353703 +0000 UTC m=+1506.495271713" observedRunningTime="2026-02-17 18:12:35.418854327 +0000 UTC m=+1507.063772337" watchObservedRunningTime="2026-02-17 18:12:46.398197985 +0000 UTC m=+1518.043115995" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.404089 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cdv68"] Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.405821 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.412442 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdv68"] Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.443436 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-utilities\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.443553 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clr7\" (UniqueName: \"kubernetes.io/projected/07d56dc9-b37c-48b5-a7fe-f0856c40e027-kube-api-access-6clr7\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.443586 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-catalog-content\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.545031 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-utilities\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.545128 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6clr7\" (UniqueName: \"kubernetes.io/projected/07d56dc9-b37c-48b5-a7fe-f0856c40e027-kube-api-access-6clr7\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.545163 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-catalog-content\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.545462 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-utilities\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.545520 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-catalog-content\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.574153 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clr7\" (UniqueName: \"kubernetes.io/projected/07d56dc9-b37c-48b5-a7fe-f0856c40e027-kube-api-access-6clr7\") pod \"community-operators-cdv68\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:46 crc kubenswrapper[4762]: I0217 18:12:46.727044 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:47 crc kubenswrapper[4762]: I0217 18:12:47.016194 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdv68"] Feb 17 18:12:47 crc kubenswrapper[4762]: I0217 18:12:47.480275 4762 generic.go:334] "Generic (PLEG): container finished" podID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerID="c7da63813340bc0535068a77c1548a0781ae2d348a9b5a533f007083903ede69" exitCode=0 Feb 17 18:12:47 crc kubenswrapper[4762]: I0217 18:12:47.480604 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerDied","Data":"c7da63813340bc0535068a77c1548a0781ae2d348a9b5a533f007083903ede69"} Feb 17 18:12:47 crc kubenswrapper[4762]: I0217 18:12:47.480650 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerStarted","Data":"5e0e2eb2fb6a01f3833496039bfe7ed3821e3c85fa63b64a9d0522263caf3216"} Feb 17 18:12:48 crc kubenswrapper[4762]: E0217 18:12:48.440230 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="glance-kuttl-tests/swift-storage-0" podUID="ae866fa5-748d-4935-a3d2-2fe08bc9693f" Feb 17 18:12:48 crc kubenswrapper[4762]: I0217 18:12:48.490209 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:12:48 crc kubenswrapper[4762]: I0217 18:12:48.490425 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerStarted","Data":"010410a97042f497aa337c7b01180936a0abd5f8d7b5ad1c09e8fc31158db5db"} Feb 17 18:12:49 crc kubenswrapper[4762]: I0217 18:12:49.499248 4762 generic.go:334] "Generic (PLEG): container finished" podID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerID="010410a97042f497aa337c7b01180936a0abd5f8d7b5ad1c09e8fc31158db5db" exitCode=0 Feb 17 18:12:49 crc kubenswrapper[4762]: I0217 18:12:49.499316 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerDied","Data":"010410a97042f497aa337c7b01180936a0abd5f8d7b5ad1c09e8fc31158db5db"} Feb 17 18:12:50 crc kubenswrapper[4762]: I0217 18:12:50.102377 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:12:50 crc kubenswrapper[4762]: I0217 18:12:50.122187 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae866fa5-748d-4935-a3d2-2fe08bc9693f-etc-swift\") pod \"swift-storage-0\" (UID: \"ae866fa5-748d-4935-a3d2-2fe08bc9693f\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:12:50 crc kubenswrapper[4762]: I0217 18:12:50.293140 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 18:12:50 crc kubenswrapper[4762]: I0217 18:12:50.510477 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerStarted","Data":"903739d1fc2218df26d0bdb18f3b156ca9d9aaee6961171700ddff0eb7e63e14"} Feb 17 18:12:50 crc kubenswrapper[4762]: I0217 18:12:50.531746 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cdv68" podStartSLOduration=2.132587382 podStartE2EDuration="4.531729089s" podCreationTimestamp="2026-02-17 18:12:46 +0000 UTC" firstStartedPulling="2026-02-17 18:12:47.482842851 +0000 UTC m=+1519.127760861" lastFinishedPulling="2026-02-17 18:12:49.881984558 +0000 UTC m=+1521.526902568" observedRunningTime="2026-02-17 18:12:50.531116152 +0000 UTC m=+1522.176034182" watchObservedRunningTime="2026-02-17 18:12:50.531729089 +0000 UTC m=+1522.176647109" Feb 17 18:12:50 crc kubenswrapper[4762]: I0217 18:12:50.730442 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Feb 17 18:12:50 crc kubenswrapper[4762]: W0217 18:12:50.739488 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae866fa5_748d_4935_a3d2_2fe08bc9693f.slice/crio-c25816e8581130721af62c6f78c533733a451282324aa982b85bb81c0085393e WatchSource:0}: Error finding container c25816e8581130721af62c6f78c533733a451282324aa982b85bb81c0085393e: Status 404 returned error can't find the container with id c25816e8581130721af62c6f78c533733a451282324aa982b85bb81c0085393e Feb 17 18:12:51 crc kubenswrapper[4762]: E0217 18:12:51.480469 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" podUID="e576e3fe-21e1-4867-adcc-bb586e3a5921" Feb 17 18:12:51 crc kubenswrapper[4762]: I0217 18:12:51.525233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"c25816e8581130721af62c6f78c533733a451282324aa982b85bb81c0085393e"} Feb 17 18:12:51 crc kubenswrapper[4762]: I0217 18:12:51.525329 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:12:52 crc kubenswrapper[4762]: I0217 18:12:52.539710 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:12:52 crc kubenswrapper[4762]: I0217 18:12:52.549954 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e576e3fe-21e1-4867-adcc-bb586e3a5921-etc-swift\") pod \"swift-proxy-5f6df75b65-p6tm7\" (UID: \"e576e3fe-21e1-4867-adcc-bb586e3a5921\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:12:52 crc kubenswrapper[4762]: I0217 18:12:52.553863 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"db8d99fddc4cfc16372297bfeaca44288dc6b329749dc47eead321ee04853f7f"} Feb 17 18:12:52 crc kubenswrapper[4762]: I0217 18:12:52.553903 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"24a47dba9eea1fdc671e0709a179538325fe22cb0607e74a4204585682a6e955"} Feb 
17 18:12:52 crc kubenswrapper[4762]: I0217 18:12:52.553913 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"8746d1a03bea23c1b456c12fe14ec40da19ba4b128e98e374c236a0e37003197"} Feb 17 18:12:52 crc kubenswrapper[4762]: I0217 18:12:52.726493 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:12:53 crc kubenswrapper[4762]: W0217 18:12:53.198335 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode576e3fe_21e1_4867_adcc_bb586e3a5921.slice/crio-b1dd48435bc9f353050ef39fdbeb11a753dc8abaed10de746b3af42e82756a56 WatchSource:0}: Error finding container b1dd48435bc9f353050ef39fdbeb11a753dc8abaed10de746b3af42e82756a56: Status 404 returned error can't find the container with id b1dd48435bc9f353050ef39fdbeb11a753dc8abaed10de746b3af42e82756a56 Feb 17 18:12:53 crc kubenswrapper[4762]: I0217 18:12:53.206861 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7"] Feb 17 18:12:53 crc kubenswrapper[4762]: I0217 18:12:53.564173 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"a75d497e9f506ad0ce42a48923b3f0ea8fad8970f5446caf60344c9e6a72a002"} Feb 17 18:12:53 crc kubenswrapper[4762]: I0217 18:12:53.565543 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" event={"ID":"e576e3fe-21e1-4867-adcc-bb586e3a5921","Type":"ContainerStarted","Data":"790c58d25c4e9c572adcf17f01886e471f418e83f2e0e44f5595b3892c592ccc"} Feb 17 18:12:53 crc kubenswrapper[4762]: I0217 18:12:53.565566 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" event={"ID":"e576e3fe-21e1-4867-adcc-bb586e3a5921","Type":"ContainerStarted","Data":"b1dd48435bc9f353050ef39fdbeb11a753dc8abaed10de746b3af42e82756a56"} Feb 17 18:12:54 crc kubenswrapper[4762]: I0217 18:12:54.575361 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"df7d9a2d45233aa408b060f2b9c03620ae93bfbc7dd88d147d023d7a7e40fae1"} Feb 17 18:12:54 crc kubenswrapper[4762]: I0217 18:12:54.575650 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"4ca35dd8319bd9263cc64658a211fd600fdc7bb9594e6a6a1c6f9b0cda133a9e"} Feb 17 18:12:54 crc kubenswrapper[4762]: I0217 18:12:54.575660 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"9e93a6bbd7c5e7a1362630a1c77c56bd1848809ef5ae83cb70b3a5c6b1405a04"} Feb 17 18:12:54 crc kubenswrapper[4762]: I0217 18:12:54.575670 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"41de5c098c7b8c6adc7e671c569ea2e177a5eb496e6444afe02afb0916e157a0"} Feb 17 18:12:54 crc kubenswrapper[4762]: I0217 18:12:54.578083 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" event={"ID":"e576e3fe-21e1-4867-adcc-bb586e3a5921","Type":"ContainerStarted","Data":"7ba9635c423a2a8eff9769fabde322d76ab66ede91d92605fd27d124ca5204c5"} Feb 17 18:12:54 crc kubenswrapper[4762]: I0217 18:12:54.578958 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:12:54 crc 
kubenswrapper[4762]: I0217 18:12:54.578981 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:12:56 crc kubenswrapper[4762]: I0217 18:12:56.729392 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:56 crc kubenswrapper[4762]: I0217 18:12:56.729679 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:56 crc kubenswrapper[4762]: I0217 18:12:56.780427 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:56 crc kubenswrapper[4762]: I0217 18:12:56.799372 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" podStartSLOduration=498.799357363 podStartE2EDuration="8m18.799357363s" podCreationTimestamp="2026-02-17 18:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:12:54.601220538 +0000 UTC m=+1526.246138568" watchObservedRunningTime="2026-02-17 18:12:56.799357363 +0000 UTC m=+1528.444275373" Feb 17 18:12:57 crc kubenswrapper[4762]: I0217 18:12:57.610386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"461b69b205ebacd14ace30f3944ca482f737a0b5866eecaa202ec62905a501c4"} Feb 17 18:12:57 crc kubenswrapper[4762]: I0217 18:12:57.610718 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"b10406d6444216d9f5b5763b5bef452e63eaaa85bbe840f798271690cea450f3"} Feb 17 18:12:57 crc kubenswrapper[4762]: I0217 
18:12:57.658474 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:12:57 crc kubenswrapper[4762]: I0217 18:12:57.707073 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdv68"] Feb 17 18:12:58 crc kubenswrapper[4762]: I0217 18:12:58.624096 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"7d4ed13c93a33a8572a42b6e6d7fef5aab5a235229b9f95145ddc41bdbc25227"} Feb 17 18:12:58 crc kubenswrapper[4762]: I0217 18:12:58.624453 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"1a332be80e45887bc9ad0643d3b68ec0d5bf6c9c962176e49ae8dd080967018c"} Feb 17 18:12:58 crc kubenswrapper[4762]: I0217 18:12:58.624473 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"3daedcc78a7e38484d9c2fd62d1cb9d973d08a22add25d37dd7307a639dc4170"} Feb 17 18:12:59 crc kubenswrapper[4762]: I0217 18:12:59.629921 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cdv68" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="registry-server" containerID="cri-o://903739d1fc2218df26d0bdb18f3b156ca9d9aaee6961171700ddff0eb7e63e14" gracePeriod=2 Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.663150 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"7649060563d450c5d79c5cfde695d24f21a79d84fc2ccb94786f01ec1a5b0074"} Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.666189 4762 
generic.go:334] "Generic (PLEG): container finished" podID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerID="903739d1fc2218df26d0bdb18f3b156ca9d9aaee6961171700ddff0eb7e63e14" exitCode=0 Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.666229 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerDied","Data":"903739d1fc2218df26d0bdb18f3b156ca9d9aaee6961171700ddff0eb7e63e14"} Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.825685 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.881232 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-utilities\") pod \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.881312 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-catalog-content\") pod \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.881364 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6clr7\" (UniqueName: \"kubernetes.io/projected/07d56dc9-b37c-48b5-a7fe-f0856c40e027-kube-api-access-6clr7\") pod \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\" (UID: \"07d56dc9-b37c-48b5-a7fe-f0856c40e027\") " Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.881950 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-utilities" 
(OuterVolumeSpecName: "utilities") pod "07d56dc9-b37c-48b5-a7fe-f0856c40e027" (UID: "07d56dc9-b37c-48b5-a7fe-f0856c40e027"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.895336 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d56dc9-b37c-48b5-a7fe-f0856c40e027-kube-api-access-6clr7" (OuterVolumeSpecName: "kube-api-access-6clr7") pod "07d56dc9-b37c-48b5-a7fe-f0856c40e027" (UID: "07d56dc9-b37c-48b5-a7fe-f0856c40e027"). InnerVolumeSpecName "kube-api-access-6clr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.927458 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07d56dc9-b37c-48b5-a7fe-f0856c40e027" (UID: "07d56dc9-b37c-48b5-a7fe-f0856c40e027"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.983574 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.983615 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clr7\" (UniqueName: \"kubernetes.io/projected/07d56dc9-b37c-48b5-a7fe-f0856c40e027-kube-api-access-6clr7\") on node \"crc\" DevicePath \"\"" Feb 17 18:13:00 crc kubenswrapper[4762]: I0217 18:13:00.983640 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d56dc9-b37c-48b5-a7fe-f0856c40e027-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.673948 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdv68" event={"ID":"07d56dc9-b37c-48b5-a7fe-f0856c40e027","Type":"ContainerDied","Data":"5e0e2eb2fb6a01f3833496039bfe7ed3821e3c85fa63b64a9d0522263caf3216"} Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.674008 4762 scope.go:117] "RemoveContainer" containerID="903739d1fc2218df26d0bdb18f3b156ca9d9aaee6961171700ddff0eb7e63e14" Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.673949 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdv68" Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.680175 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ae866fa5-748d-4935-a3d2-2fe08bc9693f","Type":"ContainerStarted","Data":"3db58df908b66049c19d8ac863c5d7b8b8802133a707ef6c6601a4f509381ac6"} Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.695058 4762 scope.go:117] "RemoveContainer" containerID="010410a97042f497aa337c7b01180936a0abd5f8d7b5ad1c09e8fc31158db5db" Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.704096 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdv68"] Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.712578 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cdv68"] Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.714756 4762 scope.go:117] "RemoveContainer" containerID="c7da63813340bc0535068a77c1548a0781ae2d348a9b5a533f007083903ede69" Feb 17 18:13:01 crc kubenswrapper[4762]: I0217 18:13:01.741714 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=501.27129164 podStartE2EDuration="8m27.741693935s" podCreationTimestamp="2026-02-17 18:04:34 +0000 UTC" firstStartedPulling="2026-02-17 18:12:50.741991877 +0000 UTC m=+1522.386909887" lastFinishedPulling="2026-02-17 18:12:57.212394172 +0000 UTC m=+1528.857312182" observedRunningTime="2026-02-17 18:13:01.733350317 +0000 UTC m=+1533.378268337" watchObservedRunningTime="2026-02-17 18:13:01.741693935 +0000 UTC m=+1533.386611945" Feb 17 18:13:02 crc kubenswrapper[4762]: I0217 18:13:02.729780 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:13:02 crc kubenswrapper[4762]: I0217 18:13:02.731107 4762 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-p6tm7" Feb 17 18:13:03 crc kubenswrapper[4762]: I0217 18:13:03.043743 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" path="/var/lib/kubelet/pods/07d56dc9-b37c-48b5-a7fe-f0856c40e027/volumes" Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.558810 4762 patch_prober.go:28] interesting pod/machine-config-daemon-jb9kz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.559160 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.559204 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.559830 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b"} pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.559886 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" 
containerName="machine-config-daemon" containerID="cri-o://8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" gracePeriod=600 Feb 17 18:13:04 crc kubenswrapper[4762]: E0217 18:13:04.692414 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.711405 4762 generic.go:334] "Generic (PLEG): container finished" podID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" exitCode=0 Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.711454 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerDied","Data":"8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b"} Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.711493 4762 scope.go:117] "RemoveContainer" containerID="dcb7e7b99c1665f4d4f459fb3d5e0f62dcd0b605d5942c6bcbc73ce48dfe3885" Feb 17 18:13:04 crc kubenswrapper[4762]: I0217 18:13:04.712158 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:13:04 crc kubenswrapper[4762]: E0217 18:13:04.712405 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:13:16 crc kubenswrapper[4762]: I0217 18:13:16.884839 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/util/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.027702 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/util/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.047158 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/pull/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.063185 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/pull/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.254774 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/pull/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.263243 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/extract/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.268848 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26nxszs_0f17fa55-8aa4-4ae0-9d3b-e1d3f638a6d5/util/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.429410 
4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/util/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.582786 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/util/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.584260 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/pull/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.590210 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/pull/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.754178 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/util/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.762248 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/pull/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.780548 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e06xxxdj_88f65670-f91f-492b-bd41-c266624e0664/extract/0.log" Feb 17 18:13:17 crc kubenswrapper[4762]: I0217 18:13:17.913582 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/util/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.141664 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/util/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.169985 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/pull/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.171762 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/pull/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.333787 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/pull/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.333929 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/extract/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.342542 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9b26ckh_574b2982-5f13-4465-99b9-19a50dd0efd7/util/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.527511 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/util/0.log" Feb 17 18:13:18 crc 
kubenswrapper[4762]: I0217 18:13:18.684584 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/pull/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.708511 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/util/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.734810 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/pull/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.857584 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/util/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.896592 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/extract/0.log" Feb 17 18:13:18 crc kubenswrapper[4762]: I0217 18:13:18.920548 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed472458qm_7999604c-7cbf-4bd9-9280-fb8d4d047737/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.042616 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:13:19 crc kubenswrapper[4762]: E0217 18:13:19.043273 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.056481 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/util/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.229831 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/util/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.275339 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.282397 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.403614 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/util/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.468108 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.470785 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qnvqs_b8e92bbe-0a6e-470d-8fcb-d774f8ae3660/extract/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.616356 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/util/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.788497 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.790503 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/util/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.797114 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.991794 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/extract/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.993862 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/pull/0.log" Feb 17 18:13:19 crc kubenswrapper[4762]: I0217 18:13:19.999196 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd634fb56_12aa14d1-1ff5-4325-8792-d43cfd40cf96/util/0.log" Feb 17 18:13:20 crc 
kubenswrapper[4762]: I0217 18:13:20.141322 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/util/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.265712 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/util/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.297215 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/pull/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.302806 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/pull/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.452724 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/util/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.455599 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/pull/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.497296 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e3bac2c93bd14a7babee012b9ba44ae3f28c9408a1973a5074a31d46fbncjw8_2729907a-9375-4c68-ab91-8470b5e7965f/extract/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.647218 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-index-jz5wd_6e043c44-ccec-451b-9ba3-505e49d89bce/registry-server/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.721506 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-55b99585d6-r8h5c_bdf1f157-1721-40cf-9c1b-288bb8190904/manager/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.769093 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-678dcfb94b-dlbqc_9cdf848e-625b-4ac0-a1c2-60c34043a95c/manager/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.833479 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-rtqff_0a83620f-b2f0-4ad8-b821-382533a09fc7/registry-server/0.log" Feb 17 18:13:20 crc kubenswrapper[4762]: I0217 18:13:20.920480 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-69b84c89c7-gd74p_05cb543d-eddd-4628-a8bc-168e3a7e5b48/manager/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.022355 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-w5gj7_e476b42e-39ec-4ac9-85c3-b71c41139171/registry-server/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.273387 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-74688bd7c7-pzbvn_36598dd3-5ec9-43b7-9752-85fff598e285/manager/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.349301 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-j2hm8_66cbf86e-4179-4923-9177-343729807287/registry-server/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.420485 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-848b445c8d-6w6cv_2b8ef1ff-c11a-4f67-a717-5e93f9fdfa4d/manager/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.497281 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-q298f_fc9ddf77-1b5c-4e67-8c36-f1b8ce9d9693/registry-server/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.550279 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-v4s4n_3c5f4f80-b6f2-47d5-a966-2f19b2911a99/operator/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.605572 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-9j27d_2a8ca2b8-ee46-4ebf-a619-8fcdab8d2c61/registry-server/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.766499 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5b455594df-pl8hb_d69b17f8-8fea-4129-b57c-5e67d1d0602a/manager/0.log" Feb 17 18:13:21 crc kubenswrapper[4762]: I0217 18:13:21.821070 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-g9bhw_46ed6271-2100-4c3b-a832-062d50f2311d/registry-server/0.log" Feb 17 18:13:28 crc kubenswrapper[4762]: I0217 18:13:28.030135 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-20e3-account-create-update-7j87t"] Feb 17 18:13:28 crc kubenswrapper[4762]: I0217 18:13:28.036161 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-g8r6d"] Feb 17 18:13:28 crc kubenswrapper[4762]: I0217 18:13:28.041933 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-20e3-account-create-update-7j87t"] Feb 17 18:13:28 crc kubenswrapper[4762]: I0217 18:13:28.048554 4762 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["glance-kuttl-tests/keystone-db-create-g8r6d"] Feb 17 18:13:29 crc kubenswrapper[4762]: I0217 18:13:29.046154 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ff6d53-9898-4d90-92aa-693f03bf528a" path="/var/lib/kubelet/pods/25ff6d53-9898-4d90-92aa-693f03bf528a/volumes" Feb 17 18:13:29 crc kubenswrapper[4762]: I0217 18:13:29.048346 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5bdeb25-310e-4aaf-8998-a5b7188cb179" path="/var/lib/kubelet/pods/f5bdeb25-310e-4aaf-8998-a5b7188cb179/volumes" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.036166 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:13:30 crc kubenswrapper[4762]: E0217 18:13:30.036807 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.336360 4762 scope.go:117] "RemoveContainer" containerID="5141b833e80fc94fe26eca5a9a02fe35e13356b4349a60200d849800087da384" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.377522 4762 scope.go:117] "RemoveContainer" containerID="9fa3f922e6b52efd7355b0422bd544df3d08e7f319a51a832f1515640f925378" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.395652 4762 scope.go:117] "RemoveContainer" containerID="4d85cb21881277c951bf6629c2b4569b90ea2f745c45c732745891e793fbbef5" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.425063 4762 scope.go:117] "RemoveContainer" containerID="7d80608f4a85df1912eda533b8a4ab3de2a71d0d1d0f9d315c6a22f95e94d866" Feb 17 18:13:30 crc kubenswrapper[4762]: 
I0217 18:13:30.451611 4762 scope.go:117] "RemoveContainer" containerID="525eb9d6d7c9ffab15788d1267ed86abb27f2342ac89bf655ce220be8de5bbfc" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.500648 4762 scope.go:117] "RemoveContainer" containerID="1934288ccee311898ca6cad67c02aabfbdb12d8996abd531912d73a8b49217d0" Feb 17 18:13:30 crc kubenswrapper[4762]: I0217 18:13:30.524756 4762 scope.go:117] "RemoveContainer" containerID="15118eab78bcab3ac2650f43733287003bf13d265d2829a514a170080991b346" Feb 17 18:13:35 crc kubenswrapper[4762]: I0217 18:13:35.764790 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4ttdt_266896ca-532c-45be-b263-727feed4415f/control-plane-machine-set-operator/0.log" Feb 17 18:13:35 crc kubenswrapper[4762]: I0217 18:13:35.916573 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smpx4_27402239-9191-42d8-89b6-8c0e12e54497/kube-rbac-proxy/0.log" Feb 17 18:13:35 crc kubenswrapper[4762]: I0217 18:13:35.948029 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smpx4_27402239-9191-42d8-89b6-8c0e12e54497/machine-api-operator/0.log" Feb 17 18:13:41 crc kubenswrapper[4762]: I0217 18:13:41.036294 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:13:41 crc kubenswrapper[4762]: E0217 18:13:41.037089 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 
18:13:42.001378 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbldg"] Feb 17 18:13:42 crc kubenswrapper[4762]: E0217 18:13:42.001992 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="extract-content" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.002005 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="extract-content" Feb 17 18:13:42 crc kubenswrapper[4762]: E0217 18:13:42.002033 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="extract-utilities" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.002040 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="extract-utilities" Feb 17 18:13:42 crc kubenswrapper[4762]: E0217 18:13:42.002056 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="registry-server" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.002063 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="registry-server" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.002199 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d56dc9-b37c-48b5-a7fe-f0856c40e027" containerName="registry-server" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.003168 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.020600 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbldg"] Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.171833 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdmv\" (UniqueName: \"kubernetes.io/projected/fcf8128c-af54-40ad-ad1f-8a0114c679fd-kube-api-access-qrdmv\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.171905 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-catalog-content\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.173406 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-utilities\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.275550 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdmv\" (UniqueName: \"kubernetes.io/projected/fcf8128c-af54-40ad-ad1f-8a0114c679fd-kube-api-access-qrdmv\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.276158 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-catalog-content\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.276308 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-utilities\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.276905 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-utilities\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.277598 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-catalog-content\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.304817 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdmv\" (UniqueName: \"kubernetes.io/projected/fcf8128c-af54-40ad-ad1f-8a0114c679fd-kube-api-access-qrdmv\") pod \"certified-operators-wbldg\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:42 crc kubenswrapper[4762]: I0217 18:13:42.479609 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:43 crc kubenswrapper[4762]: I0217 18:13:43.004642 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbldg"] Feb 17 18:13:43 crc kubenswrapper[4762]: W0217 18:13:43.020414 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf8128c_af54_40ad_ad1f_8a0114c679fd.slice/crio-f23793a36265c102e51c62fcd00d09505ffc0b5deb2d5747af22cc531357402a WatchSource:0}: Error finding container f23793a36265c102e51c62fcd00d09505ffc0b5deb2d5747af22cc531357402a: Status 404 returned error can't find the container with id f23793a36265c102e51c62fcd00d09505ffc0b5deb2d5747af22cc531357402a Feb 17 18:13:43 crc kubenswrapper[4762]: I0217 18:13:43.974179 4762 generic.go:334] "Generic (PLEG): container finished" podID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerID="3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb" exitCode=0 Feb 17 18:13:43 crc kubenswrapper[4762]: I0217 18:13:43.974222 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbldg" event={"ID":"fcf8128c-af54-40ad-ad1f-8a0114c679fd","Type":"ContainerDied","Data":"3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb"} Feb 17 18:13:43 crc kubenswrapper[4762]: I0217 18:13:43.974263 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbldg" event={"ID":"fcf8128c-af54-40ad-ad1f-8a0114c679fd","Type":"ContainerStarted","Data":"f23793a36265c102e51c62fcd00d09505ffc0b5deb2d5747af22cc531357402a"} Feb 17 18:13:45 crc kubenswrapper[4762]: I0217 18:13:45.044989 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-6zgfp"] Feb 17 18:13:45 crc kubenswrapper[4762]: I0217 18:13:45.050845 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/keystone-db-sync-6zgfp"] Feb 17 18:13:45 crc kubenswrapper[4762]: I0217 18:13:45.991069 4762 generic.go:334] "Generic (PLEG): container finished" podID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerID="27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38" exitCode=0 Feb 17 18:13:45 crc kubenswrapper[4762]: I0217 18:13:45.991118 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbldg" event={"ID":"fcf8128c-af54-40ad-ad1f-8a0114c679fd","Type":"ContainerDied","Data":"27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38"} Feb 17 18:13:46 crc kubenswrapper[4762]: I0217 18:13:46.999571 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbldg" event={"ID":"fcf8128c-af54-40ad-ad1f-8a0114c679fd","Type":"ContainerStarted","Data":"e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc"} Feb 17 18:13:47 crc kubenswrapper[4762]: I0217 18:13:47.028186 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbldg" podStartSLOduration=3.5849086100000003 podStartE2EDuration="6.028165586s" podCreationTimestamp="2026-02-17 18:13:41 +0000 UTC" firstStartedPulling="2026-02-17 18:13:43.976515408 +0000 UTC m=+1575.621433418" lastFinishedPulling="2026-02-17 18:13:46.419772384 +0000 UTC m=+1578.064690394" observedRunningTime="2026-02-17 18:13:47.023507483 +0000 UTC m=+1578.668425493" watchObservedRunningTime="2026-02-17 18:13:47.028165586 +0000 UTC m=+1578.673083616" Feb 17 18:13:47 crc kubenswrapper[4762]: I0217 18:13:47.045814 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1af73f3-931c-4417-ab51-c2888ae6a593" path="/var/lib/kubelet/pods/d1af73f3-931c-4417-ab51-c2888ae6a593/volumes" Feb 17 18:13:51 crc kubenswrapper[4762]: I0217 18:13:51.028228 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/keystone-bootstrap-nz8q8"] Feb 17 18:13:51 crc kubenswrapper[4762]: I0217 18:13:51.045012 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-nz8q8"] Feb 17 18:13:52 crc kubenswrapper[4762]: I0217 18:13:52.480017 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:52 crc kubenswrapper[4762]: I0217 18:13:52.480365 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:52 crc kubenswrapper[4762]: I0217 18:13:52.520044 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:53 crc kubenswrapper[4762]: I0217 18:13:53.044114 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8a297e-a079-4043-93d2-7a5e2574003c" path="/var/lib/kubelet/pods/ec8a297e-a079-4043-93d2-7a5e2574003c/volumes" Feb 17 18:13:53 crc kubenswrapper[4762]: I0217 18:13:53.076855 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:53 crc kubenswrapper[4762]: I0217 18:13:53.117834 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbldg"] Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.050263 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wbldg" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="registry-server" containerID="cri-o://e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc" gracePeriod=2 Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.434253 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.483978 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrdmv\" (UniqueName: \"kubernetes.io/projected/fcf8128c-af54-40ad-ad1f-8a0114c679fd-kube-api-access-qrdmv\") pod \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.484055 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-catalog-content\") pod \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.484108 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-utilities\") pod \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\" (UID: \"fcf8128c-af54-40ad-ad1f-8a0114c679fd\") " Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.485050 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-utilities" (OuterVolumeSpecName: "utilities") pod "fcf8128c-af54-40ad-ad1f-8a0114c679fd" (UID: "fcf8128c-af54-40ad-ad1f-8a0114c679fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.490818 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf8128c-af54-40ad-ad1f-8a0114c679fd-kube-api-access-qrdmv" (OuterVolumeSpecName: "kube-api-access-qrdmv") pod "fcf8128c-af54-40ad-ad1f-8a0114c679fd" (UID: "fcf8128c-af54-40ad-ad1f-8a0114c679fd"). InnerVolumeSpecName "kube-api-access-qrdmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.546990 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcf8128c-af54-40ad-ad1f-8a0114c679fd" (UID: "fcf8128c-af54-40ad-ad1f-8a0114c679fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.585541 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrdmv\" (UniqueName: \"kubernetes.io/projected/fcf8128c-af54-40ad-ad1f-8a0114c679fd-kube-api-access-qrdmv\") on node \"crc\" DevicePath \"\"" Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.585577 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:13:55 crc kubenswrapper[4762]: I0217 18:13:55.585594 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf8128c-af54-40ad-ad1f-8a0114c679fd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.036224 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:13:56 crc kubenswrapper[4762]: E0217 18:13:56.036737 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:13:56 
crc kubenswrapper[4762]: I0217 18:13:56.058781 4762 generic.go:334] "Generic (PLEG): container finished" podID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerID="e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc" exitCode=0 Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.058820 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbldg" event={"ID":"fcf8128c-af54-40ad-ad1f-8a0114c679fd","Type":"ContainerDied","Data":"e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc"} Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.058842 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbldg" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.058862 4762 scope.go:117] "RemoveContainer" containerID="e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.058848 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbldg" event={"ID":"fcf8128c-af54-40ad-ad1f-8a0114c679fd","Type":"ContainerDied","Data":"f23793a36265c102e51c62fcd00d09505ffc0b5deb2d5747af22cc531357402a"} Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.086377 4762 scope.go:117] "RemoveContainer" containerID="27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.108797 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbldg"] Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.109724 4762 scope.go:117] "RemoveContainer" containerID="3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.114765 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbldg"] Feb 17 18:13:56 crc 
kubenswrapper[4762]: I0217 18:13:56.142735 4762 scope.go:117] "RemoveContainer" containerID="e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc" Feb 17 18:13:56 crc kubenswrapper[4762]: E0217 18:13:56.143230 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc\": container with ID starting with e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc not found: ID does not exist" containerID="e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.143271 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc"} err="failed to get container status \"e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc\": rpc error: code = NotFound desc = could not find container \"e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc\": container with ID starting with e039b19b7ecf0357fb147d6c2853ef1cc9c4fa0155827c67fedd418882b6e4bc not found: ID does not exist" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.143292 4762 scope.go:117] "RemoveContainer" containerID="27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38" Feb 17 18:13:56 crc kubenswrapper[4762]: E0217 18:13:56.143698 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38\": container with ID starting with 27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38 not found: ID does not exist" containerID="27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.143742 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38"} err="failed to get container status \"27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38\": rpc error: code = NotFound desc = could not find container \"27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38\": container with ID starting with 27fb9f65a0bc48f640da34a188ab6150bebbfcd20add8a40b7b8474fb2163a38 not found: ID does not exist" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.143762 4762 scope.go:117] "RemoveContainer" containerID="3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb" Feb 17 18:13:56 crc kubenswrapper[4762]: E0217 18:13:56.144088 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb\": container with ID starting with 3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb not found: ID does not exist" containerID="3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb" Feb 17 18:13:56 crc kubenswrapper[4762]: I0217 18:13:56.144118 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb"} err="failed to get container status \"3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb\": rpc error: code = NotFound desc = could not find container \"3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb\": container with ID starting with 3eb687122e9c0d4ad5f0355f786057e653829a5f3090e2871503dfc7eed3dcdb not found: ID does not exist" Feb 17 18:13:57 crc kubenswrapper[4762]: I0217 18:13:57.044151 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" path="/var/lib/kubelet/pods/fcf8128c-af54-40ad-ad1f-8a0114c679fd/volumes" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 
18:14:04.023523 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-n248r_242cfeca-c170-4125-8784-ffdf74df96d5/kube-rbac-proxy/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.130374 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-n248r_242cfeca-c170-4125-8784-ffdf74df96d5/controller/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.236694 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-frr-files/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.397387 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-metrics/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.409108 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-reloader/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.414300 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-reloader/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.448419 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-frr-files/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.652201 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-reloader/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.664210 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-metrics/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.695012 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-frr-files/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.705341 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-metrics/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.836853 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-frr-files/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.848076 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-reloader/0.log" Feb 17 18:14:04 crc kubenswrapper[4762]: I0217 18:14:04.978950 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/cp-metrics/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.035187 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/controller/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.173846 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/frr-metrics/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.215275 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/kube-rbac-proxy/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.260187 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/kube-rbac-proxy-frr/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.395301 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/reloader/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.490690 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-v84sn_b25f9642-b43c-436a-821d-383a0912cd63/frr-k8s-webhook-server/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.719012 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fb2tl_6b32c016-322c-462b-b41d-c880ce8bd1ac/frr/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.753905 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-796c5cd795-qwv74_ea3ffdb1-8694-4cc4-90df-653c25a14fac/manager/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.858799 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-85df54ff8f-pfcdh_adbe61a0-9505-4f77-9775-fc8559ae1231/webhook-server/0.log" Feb 17 18:14:05 crc kubenswrapper[4762]: I0217 18:14:05.983815 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mdv5x_feadf162-5dc5-42c5-9c7e-b36a1659213b/kube-rbac-proxy/0.log" Feb 17 18:14:06 crc kubenswrapper[4762]: I0217 18:14:06.125196 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mdv5x_feadf162-5dc5-42c5-9c7e-b36a1659213b/speaker/0.log" Feb 17 18:14:11 crc kubenswrapper[4762]: I0217 18:14:11.035672 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:14:11 crc kubenswrapper[4762]: E0217 18:14:11.036198 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.400315 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-5f79-account-create-update-khjfs_b61af31a-dda6-45a3-97e7-d2c5271235e3/mariadb-account-create-update/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.553048 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-create-7b8c4_0aa2f592-8607-4156-8a42-e3b2f0d5ab50/mariadb-database-create/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.601522 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-sync-cpzsw_cb5ca87d-b094-4631-a254-f190fa5c5822/glance-db-sync/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.764794 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_4994f27b-c494-4a5e-8867-1d3f3ee6a766/glance-httpd/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.796352 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_4994f27b-c494-4a5e-8867-1d3f3ee6a766/glance-log/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.832852 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-1_3419cbbe-0b7e-4c04-925f-1a741ff25114/glance-httpd/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.944100 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-1_3419cbbe-0b7e-4c04-925f-1a741ff25114/glance-log/0.log" Feb 17 18:14:19 crc kubenswrapper[4762]: I0217 18:14:19.975670 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_6020f61b-1c5c-4266-941c-6b18ce30c5c7/glance-httpd/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.017830 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_6020f61b-1c5c-4266-941c-6b18ce30c5c7/glance-log/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.127566 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-1_08bd38b3-2e1b-4517-b07c-4c027b71f9fc/glance-log/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.128727 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-1_08bd38b3-2e1b-4517-b07c-4c027b71f9fc/glance-httpd/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.585669 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_1f247f60-b429-4a5b-81c5-61f533de7ef9/mysql-bootstrap/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.750475 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-5948fd7fc9-pkz2m_696388b8-20ed-48cc-98fa-117526c518da/keystone-api/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.767752 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_1f247f60-b429-4a5b-81c5-61f533de7ef9/mysql-bootstrap/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.854835 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_1f247f60-b429-4a5b-81c5-61f533de7ef9/galera/0.log" Feb 17 18:14:20 crc kubenswrapper[4762]: I0217 18:14:20.947291 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_c0dd6fbc-c7a8-46fe-aceb-25e59e083854/mysql-bootstrap/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 
18:14:21.125920 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_c0dd6fbc-c7a8-46fe-aceb-25e59e083854/galera/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 18:14:21.159850 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_c0dd6fbc-c7a8-46fe-aceb-25e59e083854/mysql-bootstrap/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 18:14:21.348522 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_f8e941eb-7039-4a71-88df-914907d84acb/mysql-bootstrap/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 18:14:21.593243 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_f8e941eb-7039-4a71-88df-914907d84acb/galera/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 18:14:21.627282 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_f8e941eb-7039-4a71-88df-914907d84acb/mysql-bootstrap/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 18:14:21.777899 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_eb0f2fa4-3b42-480e-b4c9-76d81b32a758/openstackclient/0.log" Feb 17 18:14:21 crc kubenswrapper[4762]: I0217 18:14:21.875615 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_d9a34938-3950-4fa5-a14d-30feb52b752e/setup-container/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.083357 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_d9a34938-3950-4fa5-a14d-30feb52b752e/setup-container/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.088220 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_d9a34938-3950-4fa5-a14d-30feb52b752e/rabbitmq/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 
18:14:22.267352 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-5f6df75b65-p6tm7_e576e3fe-21e1-4867-adcc-bb586e3a5921/proxy-server/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.276550 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-5f6df75b65-p6tm7_e576e3fe-21e1-4867-adcc-bb586e3a5921/proxy-httpd/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.323132 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_bc139701-f0d8-4dd3-8724-69e3e8f42e5f/memcached/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.426095 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-drbf5_f94a7cb6-015a-4a94-8a90-b34d2790a272/swift-ring-rebalance/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.476567 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/account-auditor/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.476618 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/account-reaper/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.593725 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/account-replicator/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.676739 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/account-server/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.678727 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/container-replicator/0.log" Feb 17 18:14:22 
crc kubenswrapper[4762]: I0217 18:14:22.685082 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/container-auditor/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.804083 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/container-server/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.854555 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/container-updater/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.859568 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/object-auditor/0.log" Feb 17 18:14:22 crc kubenswrapper[4762]: I0217 18:14:22.875546 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/object-expirer/0.log" Feb 17 18:14:23 crc kubenswrapper[4762]: I0217 18:14:23.016213 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/object-replicator/0.log" Feb 17 18:14:23 crc kubenswrapper[4762]: I0217 18:14:23.016839 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/object-updater/0.log" Feb 17 18:14:23 crc kubenswrapper[4762]: I0217 18:14:23.017036 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/object-server/0.log" Feb 17 18:14:23 crc kubenswrapper[4762]: I0217 18:14:23.040178 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/rsync/0.log" Feb 17 18:14:23 crc 
kubenswrapper[4762]: I0217 18:14:23.186186 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ae866fa5-748d-4935-a3d2-2fe08bc9693f/swift-recon-cron/0.log" Feb 17 18:14:26 crc kubenswrapper[4762]: I0217 18:14:26.035956 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:14:26 crc kubenswrapper[4762]: E0217 18:14:26.036578 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.662089 4762 scope.go:117] "RemoveContainer" containerID="9b1036170c641db2caf2c5258948a281094ee79924851b898529cd3836fd63ed" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.681219 4762 scope.go:117] "RemoveContainer" containerID="cbde26d443a29e776a74a416e19d59ce7e75ae17968d5a66feab8bcfeaab175b" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.716845 4762 scope.go:117] "RemoveContainer" containerID="7b6b45ba53310f8445d55cfc438f70edcc9eb4466c5151f9ce1cec45780b1ee2" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.748762 4762 scope.go:117] "RemoveContainer" containerID="7803c4d689dacc7d0a85e7769534ae2060fd784783d1b7844c4de1238094ae94" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.771935 4762 scope.go:117] "RemoveContainer" containerID="a0ba6fbf6af590f2dc3c16f6fd2a262a84344179a208628745a3f7139e65ba9b" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.802954 4762 scope.go:117] "RemoveContainer" containerID="55428f9c6f2ee74dd75d55f682f6c58aa8bc98e97bf3b6088476063fffc1b761" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 
18:14:30.859654 4762 scope.go:117] "RemoveContainer" containerID="0bd686b0459fa641e36370acbb19207cdbb705a1cd5fc72480a4efa530b44028" Feb 17 18:14:30 crc kubenswrapper[4762]: I0217 18:14:30.876409 4762 scope.go:117] "RemoveContainer" containerID="3aae3e7920896ef3ead49c930801f94439b89f8f6a52d6233e451463c5e0431a" Feb 17 18:14:34 crc kubenswrapper[4762]: I0217 18:14:34.688399 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/util/0.log" Feb 17 18:14:34 crc kubenswrapper[4762]: I0217 18:14:34.909300 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/pull/0.log" Feb 17 18:14:34 crc kubenswrapper[4762]: I0217 18:14:34.909865 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/pull/0.log" Feb 17 18:14:34 crc kubenswrapper[4762]: I0217 18:14:34.912031 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/util/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.112174 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/extract/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.115613 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/pull/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.131255 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b8lgd_d89f05a2-322d-448a-91a0-c193c28943a1/util/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.277505 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/extract-utilities/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.447613 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/extract-utilities/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.451234 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/extract-content/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.451265 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/extract-content/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.618035 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/extract-content/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.647450 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/extract-utilities/0.log" Feb 17 18:14:35 crc kubenswrapper[4762]: I0217 18:14:35.859219 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/extract-utilities/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.098489 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/extract-content/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.138308 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/extract-utilities/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.175285 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/extract-content/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.192514 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbswz_45a1e640-3aeb-47f7-8a26-a578cf7d7c18/registry-server/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.315614 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/extract-content/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.325022 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/extract-utilities/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.512399 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4mh4k_ae055f49-1dcf-4008-85fe-2f3ca1d45a75/marketplace-operator/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.667690 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/extract-utilities/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.851312 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7bzhc_18a63ac5-9c0b-4b15-96ea-7bb2d166525e/registry-server/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.892286 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/extract-content/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.920355 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/extract-content/0.log" Feb 17 18:14:36 crc kubenswrapper[4762]: I0217 18:14:36.924718 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/extract-utilities/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.072035 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/extract-utilities/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.085730 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/extract-content/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.187527 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69hrp_8a52b4d8-7eba-4af4-850d-565a3136fc8c/registry-server/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.240529 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/extract-utilities/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.443083 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/extract-utilities/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.443093 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/extract-content/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.466147 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/extract-content/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.614226 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/extract-utilities/0.log" Feb 17 18:14:37 crc kubenswrapper[4762]: I0217 18:14:37.672935 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/extract-content/0.log" Feb 17 18:14:38 crc kubenswrapper[4762]: I0217 18:14:38.035372 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:14:38 crc kubenswrapper[4762]: E0217 18:14:38.035574 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:14:38 crc kubenswrapper[4762]: I0217 18:14:38.150244 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gcnq_209bf713-7d49-4554-96bd-4922d360dbe7/registry-server/0.log" Feb 17 18:14:50 crc 
kubenswrapper[4762]: I0217 18:14:50.036440 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:14:50 crc kubenswrapper[4762]: E0217 18:14:50.037807 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.136113 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q"] Feb 17 18:15:00 crc kubenswrapper[4762]: E0217 18:15:00.136988 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="extract-utilities" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.137007 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="extract-utilities" Feb 17 18:15:00 crc kubenswrapper[4762]: E0217 18:15:00.137057 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="registry-server" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.137065 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="registry-server" Feb 17 18:15:00 crc kubenswrapper[4762]: E0217 18:15:00.137094 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="extract-content" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.137103 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="extract-content" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.137350 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf8128c-af54-40ad-ad1f-8a0114c679fd" containerName="registry-server" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.138036 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.140398 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.140744 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.148314 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q"] Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.183588 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgcxm\" (UniqueName: \"kubernetes.io/projected/6acad49c-c672-4b68-9236-7da8ec791783-kube-api-access-lgcxm\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.183883 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6acad49c-c672-4b68-9236-7da8ec791783-config-volume\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 
18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.183940 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6acad49c-c672-4b68-9236-7da8ec791783-secret-volume\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.284997 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6acad49c-c672-4b68-9236-7da8ec791783-config-volume\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.285043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6acad49c-c672-4b68-9236-7da8ec791783-secret-volume\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.285120 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgcxm\" (UniqueName: \"kubernetes.io/projected/6acad49c-c672-4b68-9236-7da8ec791783-kube-api-access-lgcxm\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.286664 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6acad49c-c672-4b68-9236-7da8ec791783-config-volume\") pod \"collect-profiles-29522535-sds6q\" (UID: 
\"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.291669 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6acad49c-c672-4b68-9236-7da8ec791783-secret-volume\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.304463 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgcxm\" (UniqueName: \"kubernetes.io/projected/6acad49c-c672-4b68-9236-7da8ec791783-kube-api-access-lgcxm\") pod \"collect-profiles-29522535-sds6q\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.461004 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:00 crc kubenswrapper[4762]: I0217 18:15:00.865238 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q"] Feb 17 18:15:00 crc kubenswrapper[4762]: W0217 18:15:00.872300 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6acad49c_c672_4b68_9236_7da8ec791783.slice/crio-fe012bf8676b37e1db42be0d1603854a3f719033d302731c6deb7cab9301a259 WatchSource:0}: Error finding container fe012bf8676b37e1db42be0d1603854a3f719033d302731c6deb7cab9301a259: Status 404 returned error can't find the container with id fe012bf8676b37e1db42be0d1603854a3f719033d302731c6deb7cab9301a259 Feb 17 18:15:01 crc kubenswrapper[4762]: I0217 18:15:01.515992 4762 generic.go:334] "Generic (PLEG): container finished" podID="6acad49c-c672-4b68-9236-7da8ec791783" containerID="aa16d57789de0b345781c46a704b31c9d6ba370e3b2ea991dd0da4a3a4c1110d" exitCode=0 Feb 17 18:15:01 crc kubenswrapper[4762]: I0217 18:15:01.516329 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" event={"ID":"6acad49c-c672-4b68-9236-7da8ec791783","Type":"ContainerDied","Data":"aa16d57789de0b345781c46a704b31c9d6ba370e3b2ea991dd0da4a3a4c1110d"} Feb 17 18:15:01 crc kubenswrapper[4762]: I0217 18:15:01.516361 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" event={"ID":"6acad49c-c672-4b68-9236-7da8ec791783","Type":"ContainerStarted","Data":"fe012bf8676b37e1db42be0d1603854a3f719033d302731c6deb7cab9301a259"} Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.779719 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.824295 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6acad49c-c672-4b68-9236-7da8ec791783-config-volume\") pod \"6acad49c-c672-4b68-9236-7da8ec791783\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.824350 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6acad49c-c672-4b68-9236-7da8ec791783-secret-volume\") pod \"6acad49c-c672-4b68-9236-7da8ec791783\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.824469 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgcxm\" (UniqueName: \"kubernetes.io/projected/6acad49c-c672-4b68-9236-7da8ec791783-kube-api-access-lgcxm\") pod \"6acad49c-c672-4b68-9236-7da8ec791783\" (UID: \"6acad49c-c672-4b68-9236-7da8ec791783\") " Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.824913 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6acad49c-c672-4b68-9236-7da8ec791783-config-volume" (OuterVolumeSpecName: "config-volume") pod "6acad49c-c672-4b68-9236-7da8ec791783" (UID: "6acad49c-c672-4b68-9236-7da8ec791783"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.829848 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6acad49c-c672-4b68-9236-7da8ec791783-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6acad49c-c672-4b68-9236-7da8ec791783" (UID: "6acad49c-c672-4b68-9236-7da8ec791783"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.830561 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acad49c-c672-4b68-9236-7da8ec791783-kube-api-access-lgcxm" (OuterVolumeSpecName: "kube-api-access-lgcxm") pod "6acad49c-c672-4b68-9236-7da8ec791783" (UID: "6acad49c-c672-4b68-9236-7da8ec791783"). InnerVolumeSpecName "kube-api-access-lgcxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.926330 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6acad49c-c672-4b68-9236-7da8ec791783-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.926392 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6acad49c-c672-4b68-9236-7da8ec791783-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:15:02 crc kubenswrapper[4762]: I0217 18:15:02.926408 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgcxm\" (UniqueName: \"kubernetes.io/projected/6acad49c-c672-4b68-9236-7da8ec791783-kube-api-access-lgcxm\") on node \"crc\" DevicePath \"\"" Feb 17 18:15:03 crc kubenswrapper[4762]: I0217 18:15:03.533879 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" event={"ID":"6acad49c-c672-4b68-9236-7da8ec791783","Type":"ContainerDied","Data":"fe012bf8676b37e1db42be0d1603854a3f719033d302731c6deb7cab9301a259"} Feb 17 18:15:03 crc kubenswrapper[4762]: I0217 18:15:03.533928 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe012bf8676b37e1db42be0d1603854a3f719033d302731c6deb7cab9301a259" Feb 17 18:15:03 crc kubenswrapper[4762]: I0217 18:15:03.533934 4762 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-sds6q" Feb 17 18:15:04 crc kubenswrapper[4762]: I0217 18:15:04.035727 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:15:04 crc kubenswrapper[4762]: E0217 18:15:04.035991 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:15:17 crc kubenswrapper[4762]: I0217 18:15:17.038410 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:15:17 crc kubenswrapper[4762]: E0217 18:15:17.039298 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:15:29 crc kubenswrapper[4762]: I0217 18:15:29.042481 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:15:29 crc kubenswrapper[4762]: E0217 18:15:29.043598 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:15:31 crc kubenswrapper[4762]: I0217 18:15:31.007448 4762 scope.go:117] "RemoveContainer" containerID="510ea20aff458e12a1c9f80ec2d286bfe370ed701ead3fe9bb442614a8bf9a73" Feb 17 18:15:31 crc kubenswrapper[4762]: I0217 18:15:31.050306 4762 scope.go:117] "RemoveContainer" containerID="7563462fc17d75735df5ca31ed1b7a309d849aa9bf9199ad2271cff1a5460924" Feb 17 18:15:42 crc kubenswrapper[4762]: I0217 18:15:42.037327 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:15:42 crc kubenswrapper[4762]: E0217 18:15:42.038350 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:15:52 crc kubenswrapper[4762]: I0217 18:15:52.907060 4762 generic.go:334] "Generic (PLEG): container finished" podID="41630b6a-bae3-4e2b-bd82-ad7c75056f70" containerID="7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013" exitCode=0 Feb 17 18:15:52 crc kubenswrapper[4762]: I0217 18:15:52.907149 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g79x5/must-gather-xs4ng" event={"ID":"41630b6a-bae3-4e2b-bd82-ad7c75056f70","Type":"ContainerDied","Data":"7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013"} Feb 17 18:15:52 crc kubenswrapper[4762]: I0217 18:15:52.908364 4762 scope.go:117] "RemoveContainer" 
containerID="7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013" Feb 17 18:15:53 crc kubenswrapper[4762]: I0217 18:15:53.311955 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g79x5_must-gather-xs4ng_41630b6a-bae3-4e2b-bd82-ad7c75056f70/gather/0.log" Feb 17 18:15:56 crc kubenswrapper[4762]: I0217 18:15:56.036070 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:15:56 crc kubenswrapper[4762]: E0217 18:15:56.036485 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.227120 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g79x5/must-gather-xs4ng"] Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.228064 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-g79x5/must-gather-xs4ng" podUID="41630b6a-bae3-4e2b-bd82-ad7c75056f70" containerName="copy" containerID="cri-o://58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae" gracePeriod=2 Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.234169 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g79x5/must-gather-xs4ng"] Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.603803 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g79x5_must-gather-xs4ng_41630b6a-bae3-4e2b-bd82-ad7c75056f70/copy/0.log" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.604433 4762 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.670611 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnnqz\" (UniqueName: \"kubernetes.io/projected/41630b6a-bae3-4e2b-bd82-ad7c75056f70-kube-api-access-gnnqz\") pod \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.670774 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41630b6a-bae3-4e2b-bd82-ad7c75056f70-must-gather-output\") pod \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\" (UID: \"41630b6a-bae3-4e2b-bd82-ad7c75056f70\") " Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.675923 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41630b6a-bae3-4e2b-bd82-ad7c75056f70-kube-api-access-gnnqz" (OuterVolumeSpecName: "kube-api-access-gnnqz") pod "41630b6a-bae3-4e2b-bd82-ad7c75056f70" (UID: "41630b6a-bae3-4e2b-bd82-ad7c75056f70"). InnerVolumeSpecName "kube-api-access-gnnqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.752399 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41630b6a-bae3-4e2b-bd82-ad7c75056f70-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "41630b6a-bae3-4e2b-bd82-ad7c75056f70" (UID: "41630b6a-bae3-4e2b-bd82-ad7c75056f70"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.772235 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnnqz\" (UniqueName: \"kubernetes.io/projected/41630b6a-bae3-4e2b-bd82-ad7c75056f70-kube-api-access-gnnqz\") on node \"crc\" DevicePath \"\"" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.772277 4762 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41630b6a-bae3-4e2b-bd82-ad7c75056f70-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.963218 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g79x5_must-gather-xs4ng_41630b6a-bae3-4e2b-bd82-ad7c75056f70/copy/0.log" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.963645 4762 generic.go:334] "Generic (PLEG): container finished" podID="41630b6a-bae3-4e2b-bd82-ad7c75056f70" containerID="58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae" exitCode=143 Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.963704 4762 scope.go:117] "RemoveContainer" containerID="58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.963811 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g79x5/must-gather-xs4ng" Feb 17 18:16:00 crc kubenswrapper[4762]: I0217 18:16:00.984569 4762 scope.go:117] "RemoveContainer" containerID="7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013" Feb 17 18:16:01 crc kubenswrapper[4762]: I0217 18:16:01.024876 4762 scope.go:117] "RemoveContainer" containerID="58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae" Feb 17 18:16:01 crc kubenswrapper[4762]: E0217 18:16:01.025440 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae\": container with ID starting with 58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae not found: ID does not exist" containerID="58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae" Feb 17 18:16:01 crc kubenswrapper[4762]: I0217 18:16:01.025483 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae"} err="failed to get container status \"58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae\": rpc error: code = NotFound desc = could not find container \"58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae\": container with ID starting with 58a3a70c36afbdce77d4a605139d851812ba03f459cd18382b96ac030d5bbeae not found: ID does not exist" Feb 17 18:16:01 crc kubenswrapper[4762]: I0217 18:16:01.025510 4762 scope.go:117] "RemoveContainer" containerID="7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013" Feb 17 18:16:01 crc kubenswrapper[4762]: E0217 18:16:01.027118 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013\": container with ID starting with 
7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013 not found: ID does not exist" containerID="7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013" Feb 17 18:16:01 crc kubenswrapper[4762]: I0217 18:16:01.027149 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013"} err="failed to get container status \"7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013\": rpc error: code = NotFound desc = could not find container \"7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013\": container with ID starting with 7d4a2b901dc95efbea5b70880f955f28da22239b3f53ef69aff0632e3afea013 not found: ID does not exist" Feb 17 18:16:01 crc kubenswrapper[4762]: I0217 18:16:01.046829 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41630b6a-bae3-4e2b-bd82-ad7c75056f70" path="/var/lib/kubelet/pods/41630b6a-bae3-4e2b-bd82-ad7c75056f70/volumes" Feb 17 18:16:07 crc kubenswrapper[4762]: I0217 18:16:07.036163 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:16:07 crc kubenswrapper[4762]: E0217 18:16:07.037533 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:16:21 crc kubenswrapper[4762]: I0217 18:16:21.035794 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:16:21 crc kubenswrapper[4762]: E0217 18:16:21.036566 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:16:31 crc kubenswrapper[4762]: I0217 18:16:31.113344 4762 scope.go:117] "RemoveContainer" containerID="737c3bd012c747cbfd3b4daa6fe52ee9322e7cc31d64b2e47335f49d62877219" Feb 17 18:16:31 crc kubenswrapper[4762]: I0217 18:16:31.143039 4762 scope.go:117] "RemoveContainer" containerID="e1fd2c667d472a338cbffe4a797dca090eb418b87a6e0cc58b0885d1b8252c4f" Feb 17 18:16:31 crc kubenswrapper[4762]: I0217 18:16:31.180016 4762 scope.go:117] "RemoveContainer" containerID="04bb74d115faf16a7128615cd39c11e5e93ab518c8a30409dcaf77a8a9c70bc3" Feb 17 18:16:31 crc kubenswrapper[4762]: I0217 18:16:31.212414 4762 scope.go:117] "RemoveContainer" containerID="d3d0daa38814fac1e03514e3f4f29de2591d070977bbe272db3214fe0cb28257" Feb 17 18:16:33 crc kubenswrapper[4762]: I0217 18:16:33.036393 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:16:33 crc kubenswrapper[4762]: E0217 18:16:33.036714 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:16:48 crc kubenswrapper[4762]: I0217 18:16:48.036395 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:16:48 crc kubenswrapper[4762]: E0217 18:16:48.037294 4762 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:16:59 crc kubenswrapper[4762]: I0217 18:16:59.040209 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:16:59 crc kubenswrapper[4762]: E0217 18:16:59.041545 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:17:12 crc kubenswrapper[4762]: I0217 18:17:12.035910 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:17:12 crc kubenswrapper[4762]: E0217 18:17:12.036728 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:17:26 crc kubenswrapper[4762]: I0217 18:17:26.036984 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:17:26 crc kubenswrapper[4762]: E0217 
18:17:26.038015 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:17:40 crc kubenswrapper[4762]: I0217 18:17:40.036496 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:17:40 crc kubenswrapper[4762]: E0217 18:17:40.037468 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:17:55 crc kubenswrapper[4762]: I0217 18:17:55.036099 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:17:55 crc kubenswrapper[4762]: E0217 18:17:55.036939 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jb9kz_openshift-machine-config-operator(7389b1a3-5839-49b0-97e8-2adcbe0fd491)\"" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" podUID="7389b1a3-5839-49b0-97e8-2adcbe0fd491" Feb 17 18:18:08 crc kubenswrapper[4762]: I0217 18:18:08.036430 4762 scope.go:117] "RemoveContainer" containerID="8fdb23fad8a7c58d392c96ab6f55b5df376f9f32bd459d5b3aa13fbaf674a05b" Feb 17 18:18:08 crc 
kubenswrapper[4762]: I0217 18:18:08.980867 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jb9kz" event={"ID":"7389b1a3-5839-49b0-97e8-2adcbe0fd491","Type":"ContainerStarted","Data":"18e21ec32eca520f77b653493f8bf5c5e58b25e0e62a2bd8ee4481392fab6c08"}